## Dataset Information
| # Nodes | # Edges | # Features |
|:-------:|:---------:|:----------:|
| 169,343 | 1,166,243 | 128 |
Pre-processed following the official codebase of https://arxiv.org/abs/2210.02016.
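For reference, a minimal loading sketch (assuming the processed graph is hosted on the Hugging Face Hub under `SauravMaheshkar/pareto-ogbn-arxiv`; the split and column names are not guaranteed by this card and should be inspected rather than assumed):
```python
from datasets import load_dataset

# Hypothetical usage sketch: the repository id comes from this card's tags;
# inspect the returned splits and columns before relying on any field name.
dataset = load_dataset("SauravMaheshkar/pareto-ogbn-arxiv")
print(dataset)
```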
## Citations
```
@inproceedings{ju2023multi,
title={Multi-task Self-supervised Graph Neural Networks Enable Stronger Task Generalization},
author={Ju, Mingxuan and Zhao, Tong and Wen, Qianlong and Yu, Wenhao and Shah, Neil and Ye, Yanfang and Zhang, Chuxu},
booktitle={International Conference on Learning Representations},
year={2023}
}
```
# Dataset Card for Evaluation run of vicgalle/Miqu-6B-truthy
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vicgalle/Miqu-6B-truthy](https://huggingface.co/vicgalle/Miqu-6B-truthy) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vicgalle__Miqu-6B-truthy",
"harness_winogrande_5",
split="train")
```
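The aggregated metrics mentioned above live in the `results` configuration, whose `latest` split always mirrors the most recent run. A minimal sketch for loading it (the exact record layout is best inspected rather than assumed):
```python
from datasets import load_dataset

# Aggregated results; the "latest" split points to the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_vicgalle__Miqu-6B-truthy",
                       "results",
                       split="latest")
print(results)  # inspect the columns before reading specific metrics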
## Latest results
These are the [latest results from run 2024-02-14T17:59:06.649913](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__Miqu-6B-truthy/blob/main/results_2024-02-14T17-59-06.649913.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2687152793489649,
"acc_stderr": 0.03096620839159574,
"acc_norm": 0.2704268553310068,
"acc_norm_stderr": 0.0317929516881723,
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299953,
"mc2": 0.5063110225747856,
"mc2_stderr": 0.016736591350337
},
"harness|arc:challenge|25": {
"acc": 0.22610921501706485,
"acc_stderr": 0.012224202097063293,
"acc_norm": 0.2764505119453925,
"acc_norm_stderr": 0.013069662474252428
},
"harness|hellaswag|10": {
"acc": 0.2574188408683529,
"acc_stderr": 0.004363185172047177,
"acc_norm": 0.26707827126070505,
"acc_norm_stderr": 0.004415293656599496
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073461,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073461
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3355263157894737,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.3355263157894737,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2981132075471698,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.2981132075471698,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3352601156069364,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.3352601156069364,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082633,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082633
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20851063829787234,
"acc_stderr": 0.026556982117838728,
"acc_norm": 0.20851063829787234,
"acc_norm_stderr": 0.026556982117838728
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776564,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776564
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358609,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358609
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3641025641025641,
"acc_stderr": 0.02439667298509477,
"acc_norm": 0.3641025641025641,
"acc_norm_stderr": 0.02439667298509477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3487394957983193,
"acc_stderr": 0.03095663632856655,
"acc_norm": 0.3487394957983193,
"acc_norm_stderr": 0.03095663632856655
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3486238532110092,
"acc_stderr": 0.020431254090714328,
"acc_norm": 0.3486238532110092,
"acc_norm_stderr": 0.020431254090714328
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.20253164556962025,
"acc_stderr": 0.026160568246601457,
"acc_norm": 0.20253164556962025,
"acc_norm_stderr": 0.026160568246601457
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.10762331838565023,
"acc_stderr": 0.020799400082879997,
"acc_norm": 0.10762331838565023,
"acc_norm_stderr": 0.020799400082879997
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.14049586776859505,
"acc_stderr": 0.03172233426002161,
"acc_norm": 0.14049586776859505,
"acc_norm_stderr": 0.03172233426002161
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.033220157957767414,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.033220157957767414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16071428571428573,
"acc_stderr": 0.03485946096475741,
"acc_norm": 0.16071428571428573,
"acc_norm_stderr": 0.03485946096475741
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20434227330779056,
"acc_stderr": 0.0144191239809319,
"acc_norm": 0.20434227330779056,
"acc_norm_stderr": 0.0144191239809319
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.022075709251757183,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.022075709251757183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.02609016250427905,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.02609016250427905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24115755627009647,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.24115755627009647,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.023246202647819746,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.023246202647819746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.025518731049537762,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.025518731049537762
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24445893089960888,
"acc_stderr": 0.010976425013113886,
"acc_norm": 0.24445893089960888,
"acc_norm_stderr": 0.010976425013113886
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2173202614379085,
"acc_stderr": 0.01668482092914859,
"acc_norm": 0.2173202614379085,
"acc_norm_stderr": 0.01668482092914859
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072774,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072774
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.26865671641791045,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.26865671641791045,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.1927710843373494,
"acc_stderr": 0.030709824050565274,
"acc_norm": 0.1927710843373494,
"acc_norm_stderr": 0.030709824050565274
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.17543859649122806,
"acc_stderr": 0.029170885500727654,
"acc_norm": 0.17543859649122806,
"acc_norm_stderr": 0.029170885500727654
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299953,
"mc2": 0.5063110225747856,
"mc2_stderr": 0.016736591350337
},
"harness|winogrande|5": {
"acc": 0.4964483030781373,
"acc_stderr": 0.01405213114691586
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
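As an illustration of how the per-task entries can be summarised, here is a small sketch (assuming the JSON above has already been parsed into a Python dict named `results`, e.g. with `json.load`) that macro-averages accuracy over the MMLU sub-tasks; this is just one possible summary, not necessarily how the leaderboard computes its aggregate:
```python
# Assumes `results` is the parsed dict shown above (e.g. loaded via json.load).
mmlu = {name: scores for name, scores in results.items()
        if name.startswith("harness|hendrycksTest-")}
mean_acc = sum(scores["acc"] for scores in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU sub-tasks, macro-averaged acc = {mean_acc:.4f}")
```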
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"region:us"
] | 2024-02-14T18:01:27+00:00 | {"pretty_name": "Evaluation run of vicgalle/Miqu-6B-truthy", "dataset_summary": "Dataset automatically created during the evaluation run of model [vicgalle/Miqu-6B-truthy](https://huggingface.co/vicgalle/Miqu-6B-truthy) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vicgalle__Miqu-6B-truthy\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T17:59:06.649913](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__Miqu-6B-truthy/blob/main/results_2024-02-14T17-59-06.649913.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2687152793489649,\n \"acc_stderr\": 0.03096620839159574,\n \"acc_norm\": 0.2704268553310068,\n \"acc_norm_stderr\": 0.0317929516881723,\n \"mc1\": 0.2521419828641371,\n \"mc1_stderr\": 0.015201522246299953,\n \"mc2\": 0.5063110225747856,\n \"mc2_stderr\": 0.016736591350337\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22610921501706485,\n \"acc_stderr\": 0.012224202097063293,\n \"acc_norm\": 0.2764505119453925,\n \"acc_norm_stderr\": 0.013069662474252428\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2574188408683529,\n \"acc_stderr\": 0.004363185172047177,\n \"acc_norm\": 0.26707827126070505,\n \"acc_norm_stderr\": 0.004415293656599496\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.03633384414073461,\n \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.03633384414073461\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3355263157894737,\n \"acc_stderr\": 0.03842498559395268,\n \"acc_norm\": 0.3355263157894737,\n \"acc_norm_stderr\": 0.03842498559395268\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.028152837942493857,\n \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.028152837942493857\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n 
\"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3352601156069364,\n \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.3352601156069364,\n \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082633,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082633\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.20851063829787234,\n \"acc_stderr\": 0.026556982117838728,\n \"acc_norm\": 0.20851063829787234,\n \"acc_norm_stderr\": 0.026556982117838728\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776564,\n \"acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776564\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358609,\n \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358609\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.3641025641025641,\n \"acc_stderr\": 0.02439667298509477,\n \"acc_norm\": 0.3641025641025641,\n \"acc_norm_stderr\": 0.02439667298509477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3487394957983193,\n \"acc_stderr\": 0.03095663632856655,\n \"acc_norm\": 0.3487394957983193,\n \"acc_norm_stderr\": 0.03095663632856655\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3486238532110092,\n \"acc_stderr\": 0.020431254090714328,\n \"acc_norm\": 0.3486238532110092,\n \"acc_norm_stderr\": 0.020431254090714328\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.20253164556962025,\n \"acc_stderr\": 0.026160568246601457,\n \"acc_norm\": 0.20253164556962025,\n \"acc_norm_stderr\": 0.026160568246601457\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.10762331838565023,\n \"acc_stderr\": 0.020799400082879997,\n \"acc_norm\": 0.10762331838565023,\n \"acc_norm_stderr\": 0.020799400082879997\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.14049586776859505,\n \"acc_stderr\": 0.03172233426002161,\n \"acc_norm\": 0.14049586776859505,\n \"acc_norm_stderr\": 0.03172233426002161\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16071428571428573,\n \"acc_stderr\": 0.03485946096475741,\n \"acc_norm\": 0.16071428571428573,\n \"acc_norm_stderr\": 0.03485946096475741\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.20434227330779056,\n \"acc_stderr\": 
0.0144191239809319,\n \"acc_norm\": 0.20434227330779056,\n \"acc_norm_stderr\": 0.0144191239809319\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757183,\n \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757183\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.02609016250427905,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.02609016250427905\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.24115755627009647,\n \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819746,\n \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819746\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24113475177304963,\n \"acc_stderr\": 0.025518731049537762,\n \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.025518731049537762\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24445893089960888,\n \"acc_stderr\": 0.010976425013113886,\n \"acc_norm\": 0.24445893089960888,\n \"acc_norm_stderr\": 0.010976425013113886\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2173202614379085,\n \"acc_stderr\": 0.01668482092914859,\n \"acc_norm\": 0.2173202614379085,\n \"acc_norm_stderr\": 0.01668482092914859\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072774,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072774\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.26865671641791045,\n \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.26865671641791045,\n \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.1927710843373494,\n \"acc_stderr\": 0.030709824050565274,\n \"acc_norm\": 0.1927710843373494,\n \"acc_norm_stderr\": 0.030709824050565274\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.17543859649122806,\n \"acc_stderr\": 0.029170885500727654,\n \"acc_norm\": 0.17543859649122806,\n \"acc_norm_stderr\": 0.029170885500727654\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2521419828641371,\n \"mc1_stderr\": 0.015201522246299953,\n \"mc2\": 0.5063110225747856,\n \"mc2_stderr\": 0.016736591350337\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4964483030781373,\n \"acc_stderr\": 0.01405213114691586\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/vicgalle/Miqu-6B-truthy", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|arc:challenge|25_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|gsm8k|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hellaswag|10_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T17-59-06.649913.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T17-59-06.649913.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T17-59-06.649913.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T17-59-06.649913.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T17-59-06.649913.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T17-59-06.649913.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["**/details_harness|winogrande|5_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T17-59-06.649913.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T17_59_06.649913", "path": ["results_2024-02-14T17-59-06.649913.parquet"]}, {"split": "latest", "path": 
["results_2024-02-14T17-59-06.649913.parquet"]}]}]} | 2024-02-14T18:01:52+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of vicgalle/Miqu-6B-truthy
Dataset automatically created during the evaluation run of model vicgalle/Miqu-6B-truthy on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-14T17:59:06.649913 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of vicgalle/Miqu-6B-truthy\n\n\n\nDataset automatically created during the evaluation run of model vicgalle/Miqu-6B-truthy on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T17:59:06.649913(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of vicgalle/Miqu-6B-truthy\n\n\n\nDataset automatically created during the evaluation run of model vicgalle/Miqu-6B-truthy on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T17:59:06.649913(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
75c0b77089f20819ff0e0b14ae1bd6c5245c9481 |
# Dataset Card for Evaluation run of FelixChao/Capricorn-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/Capricorn-7B](https://huggingface.co/FelixChao/Capricorn-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__Capricorn-7B",
"harness_winogrande_5",
split="train")
```
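
For a quick sanity check of what the loaded split contains, here is a minimal sketch (assuming the `datasets` package is installed and the split loads as above):

```python
# Columns available in the details table for this task.
print(data.features)

# First evaluated record of the split.
print(data[0])
```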
## Latest results
These are the [latest results from run 2024-02-14T18:16:40.340194](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Capricorn-7B/blob/main/results_2024-02-14T18-16-40.340194.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6548216483813056,
"acc_stderr": 0.031984917713562884,
"acc_norm": 0.6542452376031305,
"acc_norm_stderr": 0.03265205346925597,
"mc1": 0.5826193390452876,
"mc1_stderr": 0.017262891063272164,
"mc2": 0.737632324586191,
"mc2_stderr": 0.014260696530287182
},
"harness|arc:challenge|25": {
"acc": 0.6979522184300341,
"acc_stderr": 0.013417519144716417,
"acc_norm": 0.7244027303754266,
"acc_norm_stderr": 0.01305716965576184
},
"harness|hellaswag|10": {
"acc": 0.7035451105357499,
"acc_stderr": 0.004557606227194305,
"acc_norm": 0.8840868352917746,
"acc_norm_stderr": 0.003194665266078602
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778408,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778408
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.019321805557223144,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.019321805557223144
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.023710888501970565,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.023710888501970565
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461766,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461766
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.0349814938546247,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.0349814938546247
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371802,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.423463687150838,
"acc_stderr": 0.016525425898773503,
"acc_norm": 0.423463687150838,
"acc_norm_stderr": 0.016525425898773503
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083131,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083131
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5826193390452876,
"mc1_stderr": 0.017262891063272164,
"mc2": 0.737632324586191,
"mc2_stderr": 0.014260696530287182
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828075
},
"harness|gsm8k|5": {
"acc": 0.7179681576952237,
"acc_stderr": 0.012394926584335695
}
}
```
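
The aggregated scores above are also stored in the `results` configuration; a minimal sketch (assuming the `datasets` and `pandas` packages are installed) of loading the latest aggregated run into a dataframe:

```python
from datasets import load_dataset

# Load the aggregated results for the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_FelixChao__Capricorn-7B",
                       "results",
                       split="latest")

# Convert to pandas to inspect the stored columns and values.
df = results.to_pandas()
print(df.columns)
```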
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_FelixChao__Capricorn-7B | [
"region:us"
] | 2024-02-14T18:18:59+00:00 | {"pretty_name": "Evaluation run of FelixChao/Capricorn-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [FelixChao/Capricorn-7B](https://huggingface.co/FelixChao/Capricorn-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__Capricorn-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T18:16:40.340194](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Capricorn-7B/blob/main/results_2024-02-14T18-16-40.340194.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6548216483813056,\n \"acc_stderr\": 0.031984917713562884,\n \"acc_norm\": 0.6542452376031305,\n \"acc_norm_stderr\": 0.03265205346925597,\n \"mc1\": 0.5826193390452876,\n \"mc1_stderr\": 0.017262891063272164,\n \"mc2\": 0.737632324586191,\n \"mc2_stderr\": 0.014260696530287182\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6979522184300341,\n \"acc_stderr\": 0.013417519144716417,\n \"acc_norm\": 0.7244027303754266,\n \"acc_norm_stderr\": 0.01305716965576184\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7035451105357499,\n \"acc_stderr\": 0.004557606227194305,\n \"acc_norm\": 0.8840868352917746,\n \"acc_norm_stderr\": 0.003194665266078602\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 
0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778408,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778408\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223144,\n \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223144\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.676923076923077,\n \"acc_stderr\": 0.023710888501970565,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.023710888501970565\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461766,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461766\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.0349814938546247,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.0349814938546247\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n 
\"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.423463687150838,\n \"acc_stderr\": 0.016525425898773503,\n \"acc_norm\": 0.423463687150838,\n \"acc_norm_stderr\": 0.016525425898773503\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n \"acc_stderr\": 0.012745204626083131,\n \"acc_norm\": 0.46870925684485004,\n \"acc_norm_stderr\": 0.012745204626083131\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5826193390452876,\n \"mc1_stderr\": 0.017262891063272164,\n \"mc2\": 0.737632324586191,\n \"mc2_stderr\": 0.014260696530287182\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828075\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7179681576952237,\n \"acc_stderr\": 0.012394926584335695\n }\n}\n```", "repo_url": "https://huggingface.co/FelixChao/Capricorn-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|arc:challenge|25_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|gsm8k|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hellaswag|10_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T18-16-40.340194.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T18-16-40.340194.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T18-16-40.340194.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T18-16-40.340194.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T18-16-40.340194.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T18-16-40.340194.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["**/details_harness|winogrande|5_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T18-16-40.340194.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T18_16_40.340194", "path": ["results_2024-02-14T18-16-40.340194.parquet"]}, {"split": "latest", "path": 
["results_2024-02-14T18-16-40.340194.parquet"]}]}]} | 2024-02-14T18:19:24+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of FelixChao/Capricorn-7B
Dataset automatically created during the evaluation run of model FelixChao/Capricorn-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
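For example (the details repository id below is inferred from the leaderboard's usual `details_<org>__<model>` naming convention, so treat it as an assumption):

```python
from datasets import load_dataset

# Repository id inferred from the leaderboard naming convention; adjust if it differs.
data = load_dataset("open-llm-leaderboard/details_FelixChao__Capricorn-7B",
	"harness_winogrande_5",
	split="train")
```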
## Latest results
These are the latest results from run 2024-02-14T18:16:40.340194 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of FelixChao/Capricorn-7B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/Capricorn-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T18:16:40.340194(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of FelixChao/Capricorn-7B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/Capricorn-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T18:16:40.340194(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
acc9d1d64be04ccf3a3ed99656f9b78da2e52618 | # Dataset Card for "stenotype-eval-dataset-func-type-stripped-v4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | franlucc/stenotype-eval-dataset-func-type-stripped-v4 | [
"region:us"
] | 2024-02-14T18:24:15+00:00 | {"dataset_info": {"features": [{"name": "hexsha", "dtype": "string"}, {"name": "size", "dtype": "int64"}, {"name": "ext", "dtype": "string"}, {"name": "lang", "dtype": "string"}, {"name": "max_stars_repo_path", "dtype": "string"}, {"name": "max_stars_repo_name", "dtype": "string"}, {"name": "max_stars_repo_head_hexsha", "dtype": "string"}, {"name": "max_stars_repo_licenses", "sequence": "string"}, {"name": "max_stars_count", "dtype": "float64"}, {"name": "max_stars_repo_stars_event_min_datetime", "dtype": "string"}, {"name": "max_stars_repo_stars_event_max_datetime", "dtype": "string"}, {"name": "max_issues_repo_path", "dtype": "string"}, {"name": "max_issues_repo_name", "dtype": "string"}, {"name": "max_issues_repo_head_hexsha", "dtype": "string"}, {"name": "max_issues_repo_licenses", "sequence": "string"}, {"name": "max_issues_count", "dtype": "float64"}, {"name": "max_issues_repo_issues_event_min_datetime", "dtype": "string"}, {"name": "max_issues_repo_issues_event_max_datetime", "dtype": "string"}, {"name": "max_forks_repo_path", "dtype": "string"}, {"name": "max_forks_repo_name", "dtype": "string"}, {"name": "max_forks_repo_head_hexsha", "dtype": "string"}, {"name": "max_forks_repo_licenses", "sequence": "string"}, {"name": "max_forks_count", "dtype": "float64"}, {"name": "max_forks_repo_forks_event_min_datetime", "dtype": "string"}, {"name": "max_forks_repo_forks_event_max_datetime", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "avg_line_length", "dtype": "float64"}, {"name": "max_line_length", "dtype": "int64"}, {"name": "alphanum_fraction", "dtype": "float64"}, {"name": "annotation_sites", "dtype": "int64"}, {"name": "type_definitions", "dtype": "int64"}, {"name": "loc", "dtype": "int64"}, {"name": "functions", "dtype": "int64"}, {"name": "loc_per_function", "dtype": "float64"}, {"name": "estimated_tokens", "dtype": "int64"}, {"name": "fim_program", "dtype": "string"}, {"name": "fim_type", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 58508751.59050279, "num_examples": 5313}], "download_size": 3644946, "dataset_size": 58508751.59050279}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-14T18:29:25+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "stenotype-eval-dataset-func-type-stripped-v4"
More Information needed | [
"# Dataset Card for \"stenotype-eval-dataset-func-type-stripped-v4\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"stenotype-eval-dataset-func-type-stripped-v4\"\n\nMore Information needed"
] |
8dab14d9d0478e16cafc17e9dac6b58fda494448 | # Dataset Card for "Aksharantar"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | eswardivi/Aksharantar | [
"region:us"
] | 2024-02-14T18:41:46+00:00 | {"dataset_info": [{"config_name": "hindi", "features": [{"name": "english_word", "dtype": "string"}, {"name": "unique_identifier", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "native_word", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 85958047, "num_examples": 1299155}, {"name": "validation", "num_bytes": 373313, "num_examples": 6357}, {"name": "test", "num_bytes": 592834, "num_examples": 10112}], "download_size": 42891617, "dataset_size": 86924194}, {"config_name": "kannada", "features": [{"name": "english_word", "dtype": "string"}, {"name": "unique_identifier", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "native_word", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 235592729, "num_examples": 2906728}, {"name": "validation", "num_bytes": 483566, "num_examples": 7025}, {"name": "test", "num_bytes": 787611, "num_examples": 11380}], "download_size": 111543544, "dataset_size": 236863906}, {"config_name": "malayalam", "features": [{"name": "english_word", "dtype": "string"}, {"name": "unique_identifier", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "native_word", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 365378647, "num_examples": 4100621}, {"name": "validation", "num_bytes": 543027, "num_examples": 7613}, {"name": "test", "num_bytes": 892533, "num_examples": 12451}], "download_size": 169278911, "dataset_size": 366814207}, {"config_name": "tamil", "features": [{"name": "english_word", "dtype": "string"}, {"name": "unique_identifier", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "native_word", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 275540689, "num_examples": 3230902}, {"name": "validation", "num_bytes": 611413, "num_examples": 8824}, {"name": "test", "num_bytes": 770850, "num_examples": 11499}], "download_size": 125196978, "dataset_size": 276922952}, {"config_name": "telugu", "features": [{"name": "english_word", "dtype": "string"}, {"name": "unique_identifier", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "native_word", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 190122662, "num_examples": 2429562}, {"name": "validation", "num_bytes": 507490, "num_examples": 7681}, {"name": "test", "num_bytes": 661473, "num_examples": 10260}], "download_size": 91663346, "dataset_size": 191291625}], "configs": [{"config_name": "hindi", "data_files": [{"split": "train", "path": "hindi/train-*"}, {"split": "validation", "path": "hindi/validation-*"}, {"split": "test", "path": "hindi/test-*"}]}, {"config_name": "kannada", "data_files": [{"split": "train", "path": "kannada/train-*"}, {"split": "validation", "path": "kannada/validation-*"}, {"split": "test", "path": "kannada/test-*"}]}, {"config_name": "malayalam", "data_files": [{"split": "train", "path": "malayalam/train-*"}, {"split": "validation", "path": "malayalam/validation-*"}, {"split": "test", "path": "malayalam/test-*"}]}, {"config_name": "tamil", "data_files": [{"split": "train", "path": "tamil/train-*"}, {"split": "validation", "path": "tamil/validation-*"}, {"split": "test", "path": "tamil/test-*"}]}, {"config_name": "telugu", "data_files": [{"split": "train", "path": "telugu/train-*"}, {"split": "validation", "path": "telugu/validation-*"}, {"split": "test", "path": "telugu/test-*"}]}]} | 2024-02-14T18:47:29+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "Aksharantar"
More Information needed | [
"# Dataset Card for \"Aksharantar\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"Aksharantar\"\n\nMore Information needed"
] |
d63aca5a789f11210e403f700dfe4a8265b2a1a5 | # Ukrainian News Summarization Dataset
# Based on [shamotskyi/ukr_pravda_2y](https://huggingface.co/datasets/shamotskyi/ukr_pravda_2y) News Dataset
This dataset contains news articles from the Ukrainian news website pravda.com.ua, summarized using the Claude Instant summarization model. The dataset is designed to support research in Ukrainian text summarization, news headline generation, and other NLP tasks.
## Dataset Structure
The dataset is structured as a CSV file with the following columns:
* **text:** The full text of the news article.
* **summary:** The Claude Instant-generated summary of the news article via AWS Bedrock API
## Usage Examples
**Fine-tuning Summarization Models:**
```python
from datasets import load_dataset
dataset = load_dataset("d0p3/ukr-pravda-news-summary")
# Fine-tune your summarization model on the 'text' and 'summary' columns
```
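
A minimal fine-tuning sketch (assumptions: a multilingual seq2seq checkpoint such as `google/mt5-small`, a default `train` split, and illustrative hyperparameters — this is not a prescribed recipe):

```python
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForSeq2SeqLM, DataCollatorForSeq2Seq,
                          Seq2SeqTrainingArguments, Seq2SeqTrainer)

checkpoint = "google/mt5-small"  # assumption: any multilingual seq2seq checkpoint works here
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

dataset = load_dataset("d0p3/ukr-pravda-news-summary")

def preprocess(batch):
    # Tokenize the article text as input and the summary as the target
    model_inputs = tokenizer(batch["text"], max_length=1024, truncation=True)
    labels = tokenizer(text_target=batch["summary"], max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset["train"].map(preprocess, batched=True,
                                 remove_columns=dataset["train"].column_names)

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(output_dir="ukr-sum",
                                  per_device_train_batch_size=4,
                                  num_train_epochs=1),
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```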
**Evaluating Summarization Quality:**
```python
from rouge import Rouge  # pip install rouge

rouge = Rouge()
# `model_generated_summaries` is a list of summaries produced by your model,
# aligned one-to-one with the reference summaries (the default "train" split is assumed)
scores = rouge.get_scores(model_generated_summaries, dataset["train"]["summary"])
```
## Creation Process
1. **Web Scraping:** [shamotskyi/ukr_pravda_2y](https://huggingface.co/datasets/shamotskyi/ukr_pravda_2y) dataset was used as a base.
2. **Summarization:** Each article's `ukr_text` was summarized using the Claude Instant model via the AWS Bedrock API (a sketch of such a call is shown after this list).
3. **Dataset Formatting:** The data was compiled into a CSV format.
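
For reference, a sketch of how such a Bedrock call could look (this is not the authors' actual pipeline; the prompt wording and parameters are illustrative assumptions):

```python
import json
import boto3

bedrock = boto3.client("bedrock-runtime")

def summarize(article_text: str) -> str:
    # Claude Instant text-completions format on Bedrock; prompt wording is illustrative
    body = json.dumps({
        "prompt": f"\n\nHuman: Стисло підсумуй цю новину:\n\n{article_text}\n\nAssistant:",
        "max_tokens_to_sample": 300,
        "temperature": 0.3,
    })
    response = bedrock.invoke_model(modelId="anthropic.claude-instant-v1", body=body)
    return json.loads(response["body"].read())["completion"]
```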
## Licensing
This dataset is released under the CC-BY-NC-4.0 license. The rights to the original pravda.com.ua news articles remain with their respective authors.
## Ethical Considerations
* News article summarization comes with its own ethical concerns. Ensure this dataset is not used to generate misleading or deceptive content.
* Always consider the potential biases and limitations of Claude Instant as a summarization model.
## Contributors
* [d0p3]
## Expanding the Dataset
We welcome contributions! If you'd like to expand the dataset, feel free to add more articles or summaries from other Ukrainian news sources. | d0p3/ukr-pravda-news-summary | [
"task_categories:summarization",
"size_categories:10K<n<100K",
"language:uk",
"license:cc-by-nc-4.0",
"region:us"
] | 2024-02-14T19:03:05+00:00 | {"language": ["uk"], "license": "cc-by-nc-4.0", "size_categories": ["10K<n<100K"], "task_categories": ["summarization"], "pretty_name": "Ukr Pravda News Summarized v1.0"} | 2024-02-15T07:46:16+00:00 | [] | [
"uk"
] | TAGS
#task_categories-summarization #size_categories-10K<n<100K #language-Ukrainian #license-cc-by-nc-4.0 #region-us
| # Ukrainian News Summarization Dataset
# Based on shamotskyi/ukr_pravda_2y News Dataset
This dataset contains news articles from the Ukrainian news website URL, summarized using the Claude Instant summarization model. The dataset is designed to support research in Ukrainian text summarization, news headline generation, and other NLP tasks.
## Dataset Structure
The dataset is structured as a CSV file with the following columns:
* text: The full text of the news article.
* summary: The Claude Instant-generated summary of the news article via AWS Bedrock API
## Usage Examples
Fine-tuning Summarization Models:
Evaluating Summarization Quality:
## Creation Process
1. Web Scraping: shamotskyi/ukr_pravda_2y dataset was used as a base.
2. Summarization: Each article's 'ukr_text' was summarized using the Claude Instant model via AWS Bedrock API.
3. Dataset Formatting: The data was compiled into a CSV format.
## Licensing
This dataset is released under the [CC-BY-NC-4.0]. The rights to the original URL news articles remain with their respective authors.
## Ethical Considerations
* News article summarization comes with its own ethical concerns. Ensure this dataset is not used to generate misleading or deceptive content.
* Always consider the potential biases and limitations of Claude Instant as a summarization model.
## Contributors
* [d0p3]
## Expanding the Dataset
We welcome contributions! If you'd like to expand the dataset by adding more articles or summaries from other Ukrainian news sources! | [
"# Ukrainian News Summarization Dataset",
"# Based on shamotskyi/ukr_pravda_2y News Dataset\n\nThis dataset contains news articles from the Ukrainian news website URL, summarized using the Claude Instant summarization model. The dataset is designed to support research in Ukrainian text summarization, news headline generation, and other NLP tasks.",
"## Dataset Structure\n\nThe dataset is structured as a CSV file with the following columns:\n\n* text: The full text of the news article.\n* summary: The Claude Instant-generated summary of the news article via AWS Bedrock API",
"## Usage Examples\n\nFine-tuning Summarization Models:\n\n\n\nEvaluating Summarization Quality:",
"## Creation Process\n\n1. Web Scraping: shamotskyi/ukr_pravda_2y dataset was used as a base.\n2. Summarization: Each article's 'ukr_text' was summarized using the Claude Instant model via AWS Bedrock API.\n3. Dataset Formatting: The data was compiled into a CSV format.",
"## Licensing\n\nThis dataset is released under the [CC-BY-NC-4.0]. The rights to the original URL news articles remain with their respective authors.",
"## Ethical Considerations\n\n* News article summarization comes with its own ethical concerns. Ensure this dataset is not used to generate misleading or deceptive content.\n* Always consider the potential biases and limitations of Claude Instant as a summarization model.",
"## Contributors\n\n* [d0p3]",
"## Expanding the Dataset\n\nWe welcome contributions! If you'd like to expand the dataset by adding more articles or summaries from other Ukrainian news sources!"
] | [
"TAGS\n#task_categories-summarization #size_categories-10K<n<100K #language-Ukrainian #license-cc-by-nc-4.0 #region-us \n",
"# Ukrainian News Summarization Dataset",
"# Based on shamotskyi/ukr_pravda_2y News Dataset\n\nThis dataset contains news articles from the Ukrainian news website URL, summarized using the Claude Instant summarization model. The dataset is designed to support research in Ukrainian text summarization, news headline generation, and other NLP tasks.",
"## Dataset Structure\n\nThe dataset is structured as a CSV file with the following columns:\n\n* text: The full text of the news article.\n* summary: The Claude Instant-generated summary of the news article via AWS Bedrock API",
"## Usage Examples\n\nFine-tuning Summarization Models:\n\n\n\nEvaluating Summarization Quality:",
"## Creation Process\n\n1. Web Scraping: shamotskyi/ukr_pravda_2y dataset was used as a base.\n2. Summarization: Each article's 'ukr_text' was summarized using the Claude Instant model via AWS Bedrock API.\n3. Dataset Formatting: The data was compiled into a CSV format.",
"## Licensing\n\nThis dataset is released under the [CC-BY-NC-4.0]. The rights to the original URL news articles remain with their respective authors.",
"## Ethical Considerations\n\n* News article summarization comes with its own ethical concerns. Ensure this dataset is not used to generate misleading or deceptive content.\n* Always consider the potential biases and limitations of Claude Instant as a summarization model.",
"## Contributors\n\n* [d0p3]",
"## Expanding the Dataset\n\nWe welcome contributions! If you'd like to expand the dataset by adding more articles or summaries from other Ukrainian news sources!"
] |
c492aaa3d901752b4630a09be895ba31ca12d08d | Mirror for https://github.com/VITA-Group/FSGS | mileleap/FSGS | [
"region:us"
] | 2024-02-14T19:05:26+00:00 | {} | 2024-02-14T19:37:54+00:00 | [] | [] | TAGS
#region-us
| Mirror for URL | [] | [
"TAGS\n#region-us \n"
] |
6241dc4d84572609bd6e132cf93ca8d4cb66369d |
Dataset from [RAVEN: A Dataset for Relational and Analogical Visual rEasoNing](https://arxiv.org/abs/1903.02741).
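
A minimal loading sketch (the configuration and feature names below are taken from this repository's metadata, e.g. `center_single`, `distribute_four`, `distribute_nine`):

```python
from datasets import load_dataset

# Pick any of the figure configurations listed in this repo's configs
raven = load_dataset("HuggingFaceM4/RAVEN", "center_single", split="train")
example = raven[0]
print(len(example["panels"]), len(example["choices"]), example["target"])
```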
Homepage: https://github.com/WellyZhang/RAVEN | HuggingFaceM4/RAVEN | [
"arxiv:1903.02741",
"region:us"
] | 2024-02-14T19:08:23+00:00 | {"dataset_info": [{"config_name": "center_single", "features": [{"name": "panels", "list": "image"}, {"name": "choices", "list": "image"}, {"name": "structure", "dtype": {"array2_d": {"shape": [1, 8], "dtype": "string"}}}, {"name": "meta_matrix", "dtype": {"array2_d": {"shape": [8, 9], "dtype": "uint8"}}}, {"name": "meta_target", "dtype": {"array2_d": {"shape": [1, 9], "dtype": "uint8"}}}, {"name": "meta_structure", "dtype": {"array2_d": {"shape": [1, 21], "dtype": "uint8"}}}, {"name": "target", "dtype": "uint8"}, {"name": "id", "dtype": "int32"}, {"name": "metadata", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 219804916.0, "num_examples": 6000}, {"name": "validation", "num_bytes": 73611915.0, "num_examples": 2000}, {"name": "test", "num_bytes": 73880331.0, "num_examples": 2000}], "download_size": 213062764, "dataset_size": 367297162.0}, {"config_name": "distribute_four", "features": [{"name": "panels", "list": "image"}, {"name": "choices", "list": "image"}, {"name": "structure", "dtype": {"array2_d": {"shape": [1, 8], "dtype": "string"}}}, {"name": "meta_matrix", "dtype": {"array2_d": {"shape": [8, 9], "dtype": "uint8"}}}, {"name": "meta_target", "dtype": {"array2_d": {"shape": [1, 9], "dtype": "uint8"}}}, {"name": "meta_structure", "dtype": {"array2_d": {"shape": [1, 21], "dtype": "uint8"}}}, {"name": "target", "dtype": "uint8"}, {"name": "id", "dtype": "int32"}, {"name": "metadata", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 255341749.0, "num_examples": 6000}, {"name": "validation", "num_bytes": 85646249.0, "num_examples": 2000}, {"name": "test", "num_bytes": 85553961.0, "num_examples": 2000}], "download_size": 272347298, "dataset_size": 426541959.0}, {"config_name": "distribute_nine", "features": [{"name": "panels", "list": "image"}, {"name": "choices", "list": "image"}, {"name": "structure", "dtype": {"array2_d": {"shape": [1, 8], "dtype": "string"}}}, {"name": "meta_matrix", "dtype": {"array2_d": {"shape": [8, 9], "dtype": "uint8"}}}, {"name": "meta_target", "dtype": {"array2_d": {"shape": [1, 9], "dtype": "uint8"}}}, {"name": "meta_structure", "dtype": {"array2_d": {"shape": [1, 21], "dtype": "uint8"}}}, {"name": "target", "dtype": "uint8"}, {"name": "id", "dtype": "int32"}, {"name": "metadata", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 349108622.0, "num_examples": 6000}, {"name": "validation", "num_bytes": 118133610.0, "num_examples": 2000}, {"name": "test", "num_bytes": 116742247.0, "num_examples": 2000}], "download_size": 352013405, "dataset_size": 583984479.0}, {"config_name": "in_center_single_out_center_single", "features": [{"name": "panels", "list": "image"}, {"name": "choices", "list": "image"}, {"name": "structure", "dtype": {"array2_d": {"shape": [1, 8], "dtype": "string"}}}, {"name": "meta_matrix", "dtype": {"array2_d": {"shape": [8, 9], "dtype": "uint8"}}}, {"name": "meta_target", "dtype": {"array2_d": {"shape": [1, 9], "dtype": "uint8"}}}, {"name": "meta_structure", "dtype": {"array2_d": {"shape": [1, 21], "dtype": "uint8"}}}, {"name": "target", "dtype": "uint8"}, {"name": "id", "dtype": "int32"}, {"name": "metadata", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 342483526.0, "num_examples": 6000}, {"name": "validation", "num_bytes": 114191360.0, "num_examples": 2000}, {"name": "test", "num_bytes": 114324613.0, "num_examples": 2000}], "download_size": 354453412, "dataset_size": 570999499.0}, {"config_name": "in_distribute_four_out_center_single", "features": 
[{"name": "panels", "list": "image"}, {"name": "choices", "list": "image"}, {"name": "structure", "dtype": {"array2_d": {"shape": [1, 8], "dtype": "string"}}}, {"name": "meta_matrix", "dtype": {"array2_d": {"shape": [8, 9], "dtype": "uint8"}}}, {"name": "meta_target", "dtype": {"array2_d": {"shape": [1, 9], "dtype": "uint8"}}}, {"name": "meta_structure", "dtype": {"array2_d": {"shape": [1, 21], "dtype": "uint8"}}}, {"name": "target", "dtype": "uint8"}, {"name": "id", "dtype": "int32"}, {"name": "metadata", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 374132580.0, "num_examples": 6000}, {"name": "validation", "num_bytes": 125685012.0, "num_examples": 2000}, {"name": "test", "num_bytes": 124616415.0, "num_examples": 2000}], "download_size": 371649861, "dataset_size": 624434007.0}, {"config_name": "left_center_single_right_center_single", "features": [{"name": "panels", "list": "image"}, {"name": "choices", "list": "image"}, {"name": "structure", "dtype": {"array2_d": {"shape": [1, 8], "dtype": "string"}}}, {"name": "meta_matrix", "dtype": {"array2_d": {"shape": [8, 9], "dtype": "uint8"}}}, {"name": "meta_target", "dtype": {"array2_d": {"shape": [1, 9], "dtype": "uint8"}}}, {"name": "meta_structure", "dtype": {"array2_d": {"shape": [1, 21], "dtype": "uint8"}}}, {"name": "target", "dtype": "uint8"}, {"name": "id", "dtype": "int32"}, {"name": "metadata", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 252973967.0, "num_examples": 6000}, {"name": "validation", "num_bytes": 85214590.0, "num_examples": 2000}, {"name": "test", "num_bytes": 84771828.0, "num_examples": 2000}], "download_size": 238890851, "dataset_size": 422960385.0}, {"config_name": "up_center_single_down_center_single", "features": [{"name": "panels", "list": "image"}, {"name": "choices", "list": "image"}, {"name": "structure", "dtype": {"array2_d": {"shape": [1, 8], "dtype": "string"}}}, {"name": "meta_matrix", "dtype": {"array2_d": {"shape": [8, 9], "dtype": "uint8"}}}, {"name": "meta_target", "dtype": {"array2_d": {"shape": [1, 9], "dtype": "uint8"}}}, {"name": "meta_structure", "dtype": {"array2_d": {"shape": [1, 21], "dtype": "uint8"}}}, {"name": "target", "dtype": "uint8"}, {"name": "id", "dtype": "int32"}, {"name": "metadata", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 249153679.0, "num_examples": 6000}, {"name": "validation", "num_bytes": 83628250.0, "num_examples": 2000}, {"name": "test", "num_bytes": 83734170.0, "num_examples": 2000}], "download_size": 246945860, "dataset_size": 416516099.0}], "configs": [{"config_name": "center_single", "data_files": [{"split": "train", "path": "center_single/train-*"}, {"split": "validation", "path": "center_single/validation-*"}, {"split": "test", "path": "center_single/test-*"}]}, {"config_name": "distribute_four", "data_files": [{"split": "train", "path": "distribute_four/train-*"}, {"split": "validation", "path": "distribute_four/validation-*"}, {"split": "test", "path": "distribute_four/test-*"}]}, {"config_name": "distribute_nine", "data_files": [{"split": "train", "path": "distribute_nine/train-*"}, {"split": "validation", "path": "distribute_nine/validation-*"}, {"split": "test", "path": "distribute_nine/test-*"}]}, {"config_name": "in_center_single_out_center_single", "data_files": [{"split": "train", "path": "in_center_single_out_center_single/train-*"}, {"split": "validation", "path": "in_center_single_out_center_single/validation-*"}, {"split": "test", "path": "in_center_single_out_center_single/test-*"}]}, {"config_name": 
"in_distribute_four_out_center_single", "data_files": [{"split": "train", "path": "in_distribute_four_out_center_single/train-*"}, {"split": "validation", "path": "in_distribute_four_out_center_single/validation-*"}, {"split": "test", "path": "in_distribute_four_out_center_single/test-*"}]}, {"config_name": "left_center_single_right_center_single", "data_files": [{"split": "train", "path": "left_center_single_right_center_single/train-*"}, {"split": "validation", "path": "left_center_single_right_center_single/validation-*"}, {"split": "test", "path": "left_center_single_right_center_single/test-*"}]}, {"config_name": "up_center_single_down_center_single", "data_files": [{"split": "train", "path": "up_center_single_down_center_single/train-*"}, {"split": "validation", "path": "up_center_single_down_center_single/validation-*"}, {"split": "test", "path": "up_center_single_down_center_single/test-*"}]}]} | 2024-02-14T19:52:26+00:00 | [
"1903.02741"
] | [] | TAGS
#arxiv-1903.02741 #region-us
|
Dataset from RAVEN: A Dataset for Relational and Analogical Visual rEasoNing
.
Homepage: URL | [] | [
"TAGS\n#arxiv-1903.02741 #region-us \n"
] |
baafc35525f65b607b9ec5abf97e1a16605d6ae2 | # GPN-MSA predictions for all possible SNPs in the human genome (~9 billion)
For more information check out our [paper](https://doi.org/10.1101/2023.10.10.561776) and [repository](https://github.com/songlab-cal/gpn).
## Querying specific variants or genes
- Install the latest [tabix](https://www.htslib.org/doc/tabix.html):
In your current conda environment (might be slow):
```bash
conda install -c bioconda -c conda-forge htslib=1.18
```
or in a new conda environment:
```bash
conda create -n tabix -c bioconda -c conda-forge htslib=1.18
conda activate tabix
```
- Query a specific region (e.g. BRCA1), from the remote file:
```bash
tabix https://huggingface.co/datasets/songlab/gpn-msa-hg38-scores/resolve/main/scores.tsv.bgz 17:43,044,295-43,125,364
```
The output has the following columns:
| chrom | pos | ref | alt | GPN-MSA score |
| --- | --- | --- | --- | --- |
and would start like this:
```tsv
17 43044295 T A -1.60
17 43044295 T C -1.47
17 43044295 T G -1.61
17 43044296 G A -1.12
17 43044296 G C -1.46
17 43044296 G T -1.45
17 43044297 G A -1.45
17 43044297 G C -1.55
17 43044297 G T -1.54
17 43044298 A C -1.64
```
- If you want to do many queries, you might want to first download the files locally:
```bash
wget https://huggingface.co/datasets/songlab/gpn-msa-hg38-scores/resolve/main/scores.tsv.bgz
wget https://huggingface.co/datasets/songlab/gpn-msa-hg38-scores/resolve/main/scores.tsv.bgz.tbi
```
and then score:
```bash
tabix scores.tsv.bgz 17:43,044,295-43,125,364
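# Many queries can also be scripted from Python (assumption: pysam is installed):
python - <<'EOF'
import pysam

tbx = pysam.TabixFile("scores.tsv.bgz")
for row in tbx.fetch("17", 43044294, 43125364):  # pysam uses 0-based start coordinates
    chrom, pos, ref, alt, score = row.split("\t")
    print(chrom, pos, ref, alt, score)
EOF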
``` | songlab/gpn-msa-hg38-scores | [
"license:mit",
"dna",
"variant-effect-prediction",
"biology",
"genomics",
"region:us"
] | 2024-02-14T19:13:59+00:00 | {"license": "mit", "tags": ["dna", "variant-effect-prediction", "biology", "genomics"]} | 2024-02-14T21:59:55+00:00 | [] | [] | TAGS
#license-mit #dna #variant-effect-prediction #biology #genomics #region-us
| # GPN-MSA predictions for all possible SNPs in the human genome (~9 billion)
For more information check out our paper and repository.
## Querying specific variants or genes
- Install the latest tabix:
In your current conda environment (might be slow):
or in a new conda environment:
- Query a specific region (e.g. BRCA1), from the remote file:
The output has the following columns:
| chrom | pos | ref | alt | GPN-MSA score |
and would start like this:
- If you want to do many queries you might want to first download the files locally
and then score:
| [
"# GPN-MSA predictions for all possible SNPs in the human genome (~9 billion)\nFor more information check out our paper and repository.",
"## Querying specific variants or genes\n\n- Install the latest tabix: \n In your current conda environment (might be slow):\n \n or in a new conda environment:\n \n- Query a specific region (e.g. BRCA1), from the remote file: \n \n The output has the following columns: \n | chrom | pos | ref | alt | GPN-MSA score | \n and would start like this: \n \n- If you want to do many queries you might want to first download the files locally\n \n and then score:"
] | [
"TAGS\n#license-mit #dna #variant-effect-prediction #biology #genomics #region-us \n",
"# GPN-MSA predictions for all possible SNPs in the human genome (~9 billion)\nFor more information check out our paper and repository.",
"## Querying specific variants or genes\n\n- Install the latest tabix: \n In your current conda environment (might be slow):\n \n or in a new conda environment:\n \n- Query a specific region (e.g. BRCA1), from the remote file: \n \n The output has the following columns: \n | chrom | pos | ref | alt | GPN-MSA score | \n and would start like this: \n \n- If you want to do many queries you might want to first download the files locally\n \n and then score:"
] |
cb821d6c4b34e87a1fc0dd39e346ae7ab8296b9d |
# Dataset Card for Evaluation run of bigcode/starcoderbase
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [bigcode/starcoderbase](https://huggingface.co/bigcode/starcoderbase) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bigcode__starcoderbase",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-14T19:33:07.504814](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__starcoderbase/blob/main/results_2024-02-14T19-33-07.504814.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.32135391894403226,
"acc_stderr": 0.033015926396633116,
"acc_norm": 0.32341972377293626,
"acc_norm_stderr": 0.03377999870671841,
"mc1": 0.2521419828641371,
"mc1_stderr": 0.01520152224629997,
"mc2": 0.400215713503952,
"mc2_stderr": 0.014978959258933158
},
"harness|arc:challenge|25": {
"acc": 0.2815699658703072,
"acc_stderr": 0.013143376735009022,
"acc_norm": 0.302901023890785,
"acc_norm_stderr": 0.013428241573185349
},
"harness|hellaswag|10": {
"acc": 0.377414857598088,
"acc_stderr": 0.0048374934398743045,
"acc_norm": 0.47211710814578767,
"acc_norm_stderr": 0.004982016702445962
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.042446332383532286,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.042446332383532286
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3026315789473684,
"acc_stderr": 0.03738520676119667,
"acc_norm": 0.3026315789473684,
"acc_norm_stderr": 0.03738520676119667
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.33962264150943394,
"acc_stderr": 0.029146904747798335,
"acc_norm": 0.33962264150943394,
"acc_norm_stderr": 0.029146904747798335
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036622,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036622
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.03063114553919882,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.03063114553919882
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.029241883869628806,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.029241883869628806
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02256989707491841,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02256989707491841
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924315,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924315
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.33548387096774196,
"acc_stderr": 0.02686020644472435,
"acc_norm": 0.33548387096774196,
"acc_norm_stderr": 0.02686020644472435
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.031089826002937523,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.031089826002937523
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.30303030303030304,
"acc_stderr": 0.035886248000917075,
"acc_norm": 0.30303030303030304,
"acc_norm_stderr": 0.035886248000917075
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2828282828282828,
"acc_stderr": 0.03208779558786751,
"acc_norm": 0.2828282828282828,
"acc_norm_stderr": 0.03208779558786751
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3005181347150259,
"acc_stderr": 0.0330881859441575,
"acc_norm": 0.3005181347150259,
"acc_norm_stderr": 0.0330881859441575
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.28974358974358977,
"acc_stderr": 0.02300062824368797,
"acc_norm": 0.28974358974358977,
"acc_norm_stderr": 0.02300062824368797
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31092436974789917,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.31092436974789917,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26605504587155965,
"acc_stderr": 0.018946022322225597,
"acc_norm": 0.26605504587155965,
"acc_norm_stderr": 0.018946022322225597
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.02915752218460559,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.02915752218460559
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.03132179803083291,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.03132179803083291
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.29957805907172996,
"acc_stderr": 0.029818024749753095,
"acc_norm": 0.29957805907172996,
"acc_norm_stderr": 0.029818024749753095
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3991031390134529,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.3991031390134529,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.33587786259541985,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.33587786259541985,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4132231404958678,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.4132231404958678,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.0443280405529152,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.0443280405529152
},
"harness|hendrycksTest-management|5": {
"acc": 0.33980582524271846,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.33980582524271846,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.4358974358974359,
"acc_stderr": 0.03248577511578401,
"acc_norm": 0.4358974358974359,
"acc_norm_stderr": 0.03248577511578401
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.37292464878671777,
"acc_stderr": 0.01729286826945393,
"acc_norm": 0.37292464878671777,
"acc_norm_stderr": 0.01729286826945393
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3959537572254335,
"acc_stderr": 0.02632981334194624,
"acc_norm": 0.3959537572254335,
"acc_norm_stderr": 0.02632981334194624
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.35947712418300654,
"acc_stderr": 0.02747596991066095,
"acc_norm": 0.35947712418300654,
"acc_norm_stderr": 0.02747596991066095
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.40514469453376206,
"acc_stderr": 0.02788238379132595,
"acc_norm": 0.40514469453376206,
"acc_norm_stderr": 0.02788238379132595
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.33641975308641975,
"acc_stderr": 0.026289734945952926,
"acc_norm": 0.33641975308641975,
"acc_norm_stderr": 0.026289734945952926
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.027187127011503786,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.027187127011503786
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2842242503259452,
"acc_stderr": 0.01151988059651607,
"acc_norm": 0.2842242503259452,
"acc_norm_stderr": 0.01151988059651607
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.02352924218519311,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.02352924218519311
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3088235294117647,
"acc_stderr": 0.018690850273595284,
"acc_norm": 0.3088235294117647,
"acc_norm_stderr": 0.018690850273595284
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3142857142857143,
"acc_stderr": 0.02971932942241749,
"acc_norm": 0.3142857142857143,
"acc_norm_stderr": 0.02971932942241749
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3880597014925373,
"acc_stderr": 0.034457899643627506,
"acc_norm": 0.3880597014925373,
"acc_norm_stderr": 0.034457899643627506
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3313253012048193,
"acc_stderr": 0.03664314777288086,
"acc_norm": 0.3313253012048193,
"acc_norm_stderr": 0.03664314777288086
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03615507630310934,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03615507630310934
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2521419828641371,
"mc1_stderr": 0.01520152224629997,
"mc2": 0.400215713503952,
"mc2_stderr": 0.014978959258933158
},
"harness|winogrande|5": {
"acc": 0.5580110497237569,
"acc_stderr": 0.013957584079108994
},
"harness|gsm8k|5": {
"acc": 0.07884761182714177,
"acc_stderr": 0.007423390519873237
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_bigcode__starcoderbase | [
"region:us"
] | 2024-02-14T19:35:36+00:00 | {"pretty_name": "Evaluation run of bigcode/starcoderbase", "dataset_summary": "Dataset automatically created during the evaluation run of model [bigcode/starcoderbase](https://huggingface.co/bigcode/starcoderbase) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bigcode__starcoderbase\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T19:33:07.504814](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__starcoderbase/blob/main/results_2024-02-14T19-33-07.504814.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.32135391894403226,\n \"acc_stderr\": 0.033015926396633116,\n \"acc_norm\": 0.32341972377293626,\n \"acc_norm_stderr\": 0.03377999870671841,\n \"mc1\": 0.2521419828641371,\n \"mc1_stderr\": 0.01520152224629997,\n \"mc2\": 0.400215713503952,\n \"mc2_stderr\": 0.014978959258933158\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2815699658703072,\n \"acc_stderr\": 0.013143376735009022,\n \"acc_norm\": 0.302901023890785,\n \"acc_norm_stderr\": 0.013428241573185349\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.377414857598088,\n \"acc_stderr\": 0.0048374934398743045,\n \"acc_norm\": 0.47211710814578767,\n \"acc_norm_stderr\": 0.004982016702445962\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.042446332383532286,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.042446332383532286\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3026315789473684,\n \"acc_stderr\": 0.03738520676119667,\n \"acc_norm\": 0.3026315789473684,\n \"acc_norm_stderr\": 0.03738520676119667\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.33962264150943394,\n \"acc_stderr\": 0.029146904747798335,\n \"acc_norm\": 0.33962264150943394,\n \"acc_norm_stderr\": 0.029146904747798335\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036622,\n \"acc_norm\": 
0.19,\n \"acc_norm_stderr\": 0.03942772444036622\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n \"acc_stderr\": 0.03063114553919882,\n \"acc_norm\": 0.2023121387283237,\n \"acc_norm_stderr\": 0.03063114553919882\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.029241883869628806,\n \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.029241883869628806\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.35964912280701755,\n \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02256989707491841,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02256989707491841\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n \"acc_stderr\": 0.03852273364924315,\n \"acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.03852273364924315\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.33548387096774196,\n \"acc_stderr\": 0.02686020644472435,\n \"acc_norm\": 0.33548387096774196,\n \"acc_norm_stderr\": 0.02686020644472435\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.031089826002937523,\n \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.031089826002937523\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.30303030303030304,\n \"acc_stderr\": 0.035886248000917075,\n \"acc_norm\": 0.30303030303030304,\n \"acc_norm_stderr\": 0.035886248000917075\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2828282828282828,\n \"acc_stderr\": 0.03208779558786751,\n \"acc_norm\": 0.2828282828282828,\n \"acc_norm_stderr\": 0.03208779558786751\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.3005181347150259,\n \"acc_stderr\": 0.0330881859441575,\n \"acc_norm\": 0.3005181347150259,\n \"acc_norm_stderr\": 0.0330881859441575\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.28974358974358977,\n \"acc_stderr\": 0.02300062824368797,\n \"acc_norm\": 0.28974358974358977,\n \"acc_norm_stderr\": 0.02300062824368797\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.31092436974789917,\n \"acc_stderr\": 0.030066761582977934,\n \"acc_norm\": 0.31092436974789917,\n \"acc_norm_stderr\": 0.030066761582977934\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.26605504587155965,\n \"acc_stderr\": 0.018946022322225597,\n \"acc_norm\": 0.26605504587155965,\n \"acc_norm_stderr\": 0.018946022322225597\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.02915752218460559,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.02915752218460559\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.03132179803083291,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.03132179803083291\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.29957805907172996,\n \"acc_stderr\": 0.029818024749753095,\n \"acc_norm\": 0.29957805907172996,\n \"acc_norm_stderr\": 0.029818024749753095\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3991031390134529,\n \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.3991031390134529,\n \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.33587786259541985,\n \"acc_stderr\": 0.041423137719966634,\n \"acc_norm\": 0.33587786259541985,\n \"acc_norm_stderr\": 0.041423137719966634\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.4132231404958678,\n \"acc_stderr\": 0.04495087843548408,\n \"acc_norm\": 0.4132231404958678,\n \"acc_norm_stderr\": 0.04495087843548408\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.03487825168497892,\n \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.03487825168497892\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n \"acc_stderr\": 0.0443280405529152,\n \"acc_norm\": 0.32142857142857145,\n \"acc_norm_stderr\": 0.0443280405529152\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.33980582524271846,\n \"acc_stderr\": 0.046897659372781335,\n \"acc_norm\": 0.33980582524271846,\n \"acc_norm_stderr\": 0.046897659372781335\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.4358974358974359,\n \"acc_stderr\": 0.03248577511578401,\n \"acc_norm\": 0.4358974358974359,\n \"acc_norm_stderr\": 0.03248577511578401\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.37292464878671777,\n \"acc_stderr\": 
0.01729286826945393,\n \"acc_norm\": 0.37292464878671777,\n \"acc_norm_stderr\": 0.01729286826945393\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.3959537572254335,\n \"acc_stderr\": 0.02632981334194624,\n \"acc_norm\": 0.3959537572254335,\n \"acc_norm_stderr\": 0.02632981334194624\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.35947712418300654,\n \"acc_stderr\": 0.02747596991066095,\n \"acc_norm\": 0.35947712418300654,\n \"acc_norm_stderr\": 0.02747596991066095\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.40514469453376206,\n \"acc_stderr\": 0.02788238379132595,\n \"acc_norm\": 0.40514469453376206,\n \"acc_norm_stderr\": 0.02788238379132595\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.33641975308641975,\n \"acc_stderr\": 0.026289734945952926,\n \"acc_norm\": 0.33641975308641975,\n \"acc_norm_stderr\": 0.026289734945952926\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.29432624113475175,\n \"acc_stderr\": 0.027187127011503786,\n \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.027187127011503786\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2842242503259452,\n \"acc_stderr\": 0.01151988059651607,\n \"acc_norm\": 0.2842242503259452,\n \"acc_norm_stderr\": 0.01151988059651607\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.02352924218519311,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.02352924218519311\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.3088235294117647,\n \"acc_stderr\": 0.018690850273595284,\n \"acc_norm\": 0.3088235294117647,\n \"acc_norm_stderr\": 0.018690850273595284\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.36363636363636365,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3142857142857143,\n \"acc_stderr\": 0.02971932942241749,\n \"acc_norm\": 0.3142857142857143,\n \"acc_norm_stderr\": 0.02971932942241749\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3880597014925373,\n \"acc_stderr\": 0.034457899643627506,\n \"acc_norm\": 0.3880597014925373,\n \"acc_norm_stderr\": 0.034457899643627506\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n \"acc_stderr\": 0.03664314777288086,\n \"acc_norm\": 0.3313253012048193,\n \"acc_norm_stderr\": 0.03664314777288086\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03615507630310934,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03615507630310934\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2521419828641371,\n \"mc1_stderr\": 0.01520152224629997,\n \"mc2\": 0.400215713503952,\n \"mc2_stderr\": 0.014978959258933158\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5580110497237569,\n \"acc_stderr\": 0.013957584079108994\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07884761182714177,\n \"acc_stderr\": 0.007423390519873237\n }\n}\n```", "repo_url": 
"https://huggingface.co/bigcode/starcoderbase", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|arc:challenge|25_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|gsm8k|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hellaswag|10_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T19-33-07.504814.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T19-33-07.504814.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T19-33-07.504814.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T19-33-07.504814.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T19-33-07.504814.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T19-33-07.504814.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["**/details_harness|winogrande|5_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T19-33-07.504814.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T19_33_07.504814", "path": ["results_2024-02-14T19-33-07.504814.parquet"]}, {"split": "latest", "path": 
["results_2024-02-14T19-33-07.504814.parquet"]}]}]} | 2024-02-14T19:35:59+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of bigcode/starcoderbase
Dataset automatically created during the evaluation run of model bigcode/starcoderbase on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
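A minimal sketch of the loading call, assuming the repository follows the leaderboard's usual `details_<org>__<model>` naming convention:

```python
from datasets import load_dataset

# Load the results of one evaluation task (here the 5-shot Winogrande run)
data = load_dataset("open-llm-leaderboard/details_bigcode__starcoderbase",
                    "harness_winogrande_5",
                    split="train")
```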
## Latest results
These are the latest results from run 2024-02-14T19:33:07.504814 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of bigcode/starcoderbase\n\n\n\nDataset automatically created during the evaluation run of model bigcode/starcoderbase on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T19:33:07.504814(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of bigcode/starcoderbase\n\n\n\nDataset automatically created during the evaluation run of model bigcode/starcoderbase on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T19:33:07.504814(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
db6ab3b4ced3b27a5fca34a01583a3f1b3b716ce | # Args
```python
{'base_model': 'mistralai/Mistral-7B-v0.1',
'check_length_correctness': True,
'debug': False,
'hf_entity': 'vwxyzjn',
'params': TaskQueryHParams(length=3000,
format_str='SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
truncate_field='post',
truncate_text='\n',
padding='pad_token',
pad_token=[32000],
pad_side='left',
max_query_length=3000,
max_sft_query_response_length=4000,
max_sft_response_length=1500,
max_rm_query_response_length=4500),
'push_to_hub': True}
```
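A minimal sketch (not the dataset's actual preprocessing script) of how the hyperparameters above are typically applied: the format string is filled from an example, the `post` field is truncated at newline boundaries until the tokenized query fits `max_query_length`, and the result is left-padded with the pad token (id 32000). The `encode` callable stands in for the `mistralai/Mistral-7B-v0.1` tokenizer.

```python
FORMAT_STR = "SUBREDDIT: r/{subreddit}\n\nTITLE: {title}\n\nPOST: {post}\n\nTL;DR:"

def build_query(example, encode, max_query_length=3000,
                truncate_field="post", truncate_text="\n", pad_token_id=32000):
    # Fill the prompt template from the raw example fields.
    fields = dict(example)
    query = FORMAT_STR.format(**fields)
    # Drop trailing lines of the truncated field until the tokenized query fits.
    while len(encode(query)) > max_query_length and truncate_text in fields[truncate_field]:
        fields[truncate_field] = fields[truncate_field].rsplit(truncate_text, 1)[0]
        query = FORMAT_STR.format(**fields)
    ids = encode(query)[:max_query_length]
    # pad_side='left': prepend pad tokens so the prompt ends right where generation starts.
    return [pad_token_id] * (max_query_length - len(ids)) + ids
```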
| vwxyzjn/ultrachat_200k_filtered_1707945637 | [
"region:us"
] | 2024-02-14T21:23:31+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_token", "sequence": "int64"}, {"name": "query_reference_response", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_reference_response_token", "sequence": "int64"}, {"name": "query_reference_response_token_len", "dtype": "int64"}, {"name": "query_token_len", "dtype": "int64"}, {"name": "reference_response", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "reference_response_token", "sequence": "int64"}, {"name": "reference_response_token_len", "dtype": "int64"}], "splits": [{"name": "test_sft", "num_bytes": 1913750440.5861099, "num_examples": 22991}, {"name": "train_sft", "num_bytes": 17223457215.44901, "num_examples": 206698}], "download_size": 3300532500, "dataset_size": 19137207656.03512}} | 2024-02-14T21:28:19+00:00 | [] | [] | TAGS
#region-us
| # Args
| [
"# Args"
] | [
"TAGS\n#region-us \n",
"# Args"
] |
df0772de6dac786194c1bf6489c07eb86dcd3465 | # Dataset Card for "ultrafeedback_binarized_1707945637"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | vwxyzjn/ultrafeedback_binarized_1707945637 | [
"region:us"
] | 2024-02-14T21:27:44+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "score_chosen", "dtype": "float64"}, {"name": "score_rejected", "dtype": "float64"}, {"name": "query", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_token", "sequence": "int64"}, {"name": "query_token_len", "dtype": "int64"}, {"name": "chosen_token", "sequence": "int64"}, {"name": "chosen_token_len", "dtype": "int64"}, {"name": "chosen_response_token", "sequence": "int64"}, {"name": "chosen_response_token_len", "dtype": "int64"}, {"name": "rejected_token", "sequence": "int64"}, {"name": "rejected_token_len", "dtype": "int64"}, {"name": "rejected_response_token", "sequence": "int64"}, {"name": "rejected_response_token_len", "dtype": "int64"}], "splits": [{"name": "test_prefs", "num_bytes": 216278383.0, "num_examples": 2000}, {"name": "train_prefs", "num_bytes": 6612816240.948507, "num_examples": 61119}], "download_size": 477181463, "dataset_size": 6829094623.948507}} | 2024-02-14T21:28:18+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ultrafeedback_binarized_1707945637"
More Information needed | [
"# Dataset Card for \"ultrafeedback_binarized_1707945637\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"ultrafeedback_binarized_1707945637\"\n\nMore Information needed"
] |
fa34d3a4c2cc6a79c0418654e52236164fd43d75 | Dragon_v1_training dataset | Terresa/Dragon_v1_data | [
"region:us"
] | 2024-02-14T21:30:28+00:00 | {} | 2024-02-14T21:31:55+00:00 | [] | [] | TAGS
#region-us
| Dragon_v1_training dataset | [] | [
"TAGS\n#region-us \n"
] |
5d8ad491641c3b5f3104428b9e21ccbddad1f6d6 |
# Dataset Card for Evaluation run of bigcode/starcoderbase-1b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [bigcode/starcoderbase-1b](https://huggingface.co/bigcode/starcoderbase-1b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bigcode__starcoderbase-1b",
"harness_winogrande_5",
split="train")
```
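Other per-task configurations (for example `harness_gsm8k_5`, `harness_hellaswag_10`, or any of the `harness_hendrycksTest_*_5` subsets listed in the metadata below) can be loaded the same way by swapping the configuration name.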
## Latest results
These are the [latest results from run 2024-02-14T21:51:42.530406](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__starcoderbase-1b/blob/main/results_2024-02-14T21-51-42.530406.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2656327745815198,
"acc_stderr": 0.03133338710793329,
"acc_norm": 0.26735820509373515,
"acc_norm_stderr": 0.032129928643110324,
"mc1": 0.2729498164014688,
"mc1_stderr": 0.01559475363200652,
"mc2": 0.4578928664903403,
"mc2_stderr": 0.015155546755030565
},
"harness|arc:challenge|25": {
"acc": 0.18686006825938567,
"acc_stderr": 0.011391015649694386,
"acc_norm": 0.22696245733788395,
"acc_norm_stderr": 0.012240491536132863
},
"harness|hellaswag|10": {
"acc": 0.30392352121091415,
"acc_stderr": 0.0045901000501988275,
"acc_norm": 0.3430591515634336,
"acc_norm_stderr": 0.004737608340163395
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.038850042458002526,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.038850042458002526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27547169811320754,
"acc_stderr": 0.027495663683724067,
"acc_norm": 0.27547169811320754,
"acc_norm_stderr": 0.027495663683724067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.1907514450867052,
"acc_stderr": 0.029957851329869337,
"acc_norm": 0.1907514450867052,
"acc_norm_stderr": 0.029957851329869337
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231004,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309994,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309994
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23544973544973544,
"acc_stderr": 0.02185150982203171,
"acc_norm": 0.23544973544973544,
"acc_norm_stderr": 0.02185150982203171
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03718489006818115,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03718489006818115
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2838709677419355,
"acc_stderr": 0.025649381063029268,
"acc_norm": 0.2838709677419355,
"acc_norm_stderr": 0.025649381063029268
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.031270907132976984,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.031270907132976984
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.03027690994517826,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.03027690994517826
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.27692307692307694,
"acc_stderr": 0.022688042352424994,
"acc_norm": 0.27692307692307694,
"acc_norm_stderr": 0.022688042352424994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.02738140692786897,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.02738140692786897
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008936,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23853211009174313,
"acc_stderr": 0.018272575810231863,
"acc_norm": 0.23853211009174313,
"acc_norm_stderr": 0.018272575810231863
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.02732547096671631,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.02732547096671631
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.1984732824427481,
"acc_stderr": 0.034981493854624734,
"acc_norm": 0.1984732824427481,
"acc_norm_stderr": 0.034981493854624734
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591206,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591206
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2085889570552147,
"acc_stderr": 0.03192193448934723,
"acc_norm": 0.2085889570552147,
"acc_norm_stderr": 0.03192193448934723
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690875,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690875
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.1752136752136752,
"acc_stderr": 0.024904439098918218,
"acc_norm": 0.1752136752136752,
"acc_norm_stderr": 0.024904439098918218
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2822477650063857,
"acc_stderr": 0.016095302969878576,
"acc_norm": 0.2822477650063857,
"acc_norm_stderr": 0.016095302969878576
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23410404624277456,
"acc_stderr": 0.022797110278071145,
"acc_norm": 0.23410404624277456,
"acc_norm_stderr": 0.022797110278071145
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27124183006535946,
"acc_stderr": 0.025457756696667878,
"acc_norm": 0.27124183006535946,
"acc_norm_stderr": 0.025457756696667878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2797427652733119,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.2797427652733119,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.27469135802469136,
"acc_stderr": 0.02483605786829468,
"acc_norm": 0.27469135802469136,
"acc_norm_stderr": 0.02483605786829468
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.02657786094330785,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.02657786094330785
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25749674054758803,
"acc_stderr": 0.01116770601490415,
"acc_norm": 0.25749674054758803,
"acc_norm_stderr": 0.01116770601490415
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4375,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.22712418300653595,
"acc_stderr": 0.01694985327921237,
"acc_norm": 0.22712418300653595,
"acc_norm_stderr": 0.01694985327921237
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3183673469387755,
"acc_stderr": 0.029822533793982045,
"acc_norm": 0.3183673469387755,
"acc_norm_stderr": 0.029822533793982045
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014645,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014645
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3493975903614458,
"acc_stderr": 0.0371172519074075,
"acc_norm": 0.3493975903614458,
"acc_norm_stderr": 0.0371172519074075
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.03377310252209194,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.03377310252209194
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2729498164014688,
"mc1_stderr": 0.01559475363200652,
"mc2": 0.4578928664903403,
"mc2_stderr": 0.015155546755030565
},
"harness|winogrande|5": {
"acc": 0.4996053670086819,
"acc_stderr": 0.014052481306049516
},
"harness|gsm8k|5": {
"acc": 0.009097801364670205,
"acc_stderr": 0.0026153265107756725
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_bigcode__starcoderbase-1b | [
"region:us"
] | 2024-02-14T21:54:09+00:00 | {"pretty_name": "Evaluation run of bigcode/starcoderbase-1b", "dataset_summary": "Dataset automatically created during the evaluation run of model [bigcode/starcoderbase-1b](https://huggingface.co/bigcode/starcoderbase-1b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bigcode__starcoderbase-1b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T21:51:42.530406](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__starcoderbase-1b/blob/main/results_2024-02-14T21-51-42.530406.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2656327745815198,\n \"acc_stderr\": 0.03133338710793329,\n \"acc_norm\": 0.26735820509373515,\n \"acc_norm_stderr\": 0.032129928643110324,\n \"mc1\": 0.2729498164014688,\n \"mc1_stderr\": 0.01559475363200652,\n \"mc2\": 0.4578928664903403,\n \"mc2_stderr\": 0.015155546755030565\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.18686006825938567,\n \"acc_stderr\": 0.011391015649694386,\n \"acc_norm\": 0.22696245733788395,\n \"acc_norm_stderr\": 0.012240491536132863\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.30392352121091415,\n \"acc_stderr\": 0.0045901000501988275,\n \"acc_norm\": 0.3430591515634336,\n \"acc_norm_stderr\": 0.004737608340163395\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.27547169811320754,\n \"acc_stderr\": 0.027495663683724067,\n \"acc_norm\": 0.27547169811320754,\n \"acc_norm_stderr\": 0.027495663683724067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 
0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.1907514450867052,\n \"acc_stderr\": 0.029957851329869337,\n \"acc_norm\": 0.1907514450867052,\n \"acc_norm_stderr\": 0.029957851329869337\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231004,\n \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231004\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309994,\n \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309994\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.23544973544973544,\n \"acc_stderr\": 0.02185150982203171,\n \"acc_norm\": 0.23544973544973544,\n \"acc_norm_stderr\": 0.02185150982203171\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03718489006818115,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03718489006818115\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2838709677419355,\n \"acc_stderr\": 0.025649381063029268,\n \"acc_norm\": 0.2838709677419355,\n \"acc_norm_stderr\": 0.025649381063029268\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.031270907132976984,\n \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132976984\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.03027690994517826,\n \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.03027690994517826\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.27692307692307694,\n \"acc_stderr\": 0.022688042352424994,\n \"acc_norm\": 0.27692307692307694,\n \"acc_norm_stderr\": 0.022688042352424994\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.02738140692786897,\n \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.02738140692786897\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008936,\n \"acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008936\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23853211009174313,\n \"acc_stderr\": 0.018272575810231863,\n \"acc_norm\": 0.23853211009174313,\n \"acc_norm_stderr\": 0.018272575810231863\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.02732547096671631,\n \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.02732547096671631\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.1984732824427481,\n \"acc_stderr\": 0.034981493854624734,\n \"acc_norm\": 0.1984732824427481,\n \"acc_norm_stderr\": 0.034981493854624734\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591206,\n \"acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591206\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2085889570552147,\n \"acc_stderr\": 0.03192193448934723,\n \"acc_norm\": 0.2085889570552147,\n \"acc_norm_stderr\": 0.03192193448934723\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690875,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690875\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.1752136752136752,\n \"acc_stderr\": 0.024904439098918218,\n \"acc_norm\": 0.1752136752136752,\n \"acc_norm_stderr\": 0.024904439098918218\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.2822477650063857,\n \"acc_stderr\": 0.016095302969878576,\n \"acc_norm\": 0.2822477650063857,\n \"acc_norm_stderr\": 0.016095302969878576\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.23410404624277456,\n \"acc_stderr\": 0.022797110278071145,\n \"acc_norm\": 0.23410404624277456,\n \"acc_norm_stderr\": 0.022797110278071145\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.27124183006535946,\n \"acc_stderr\": 0.025457756696667878,\n \"acc_norm\": 0.27124183006535946,\n \"acc_norm_stderr\": 0.025457756696667878\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2797427652733119,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.2797427652733119,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.27469135802469136,\n \"acc_stderr\": 0.02483605786829468,\n \"acc_norm\": 0.27469135802469136,\n \"acc_norm_stderr\": 0.02483605786829468\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2730496453900709,\n \"acc_stderr\": 0.02657786094330785,\n \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.02657786094330785\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25749674054758803,\n \"acc_stderr\": 0.01116770601490415,\n \"acc_norm\": 0.25749674054758803,\n \"acc_norm_stderr\": 0.01116770601490415\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.030134614954403924,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.030134614954403924\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.22712418300653595,\n \"acc_stderr\": 0.01694985327921237,\n \"acc_norm\": 0.22712418300653595,\n \"acc_norm_stderr\": 0.01694985327921237\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3183673469387755,\n \"acc_stderr\": 0.029822533793982045,\n \"acc_norm\": 0.3183673469387755,\n \"acc_norm_stderr\": 0.029822533793982045\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.030360490154014645,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.030360490154014645\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3493975903614458,\n \"acc_stderr\": 0.0371172519074075,\n \"acc_norm\": 0.3493975903614458,\n \"acc_norm_stderr\": 0.0371172519074075\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.03377310252209194,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.03377310252209194\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2729498164014688,\n \"mc1_stderr\": 0.01559475363200652,\n \"mc2\": 0.4578928664903403,\n \"mc2_stderr\": 0.015155546755030565\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4996053670086819,\n \"acc_stderr\": 0.014052481306049516\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009097801364670205,\n \"acc_stderr\": 0.0026153265107756725\n }\n}\n```", "repo_url": 
"https://huggingface.co/bigcode/starcoderbase-1b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|arc:challenge|25_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|gsm8k|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hellaswag|10_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T21-51-42.530406.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T21-51-42.530406.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T21-51-42.530406.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T21-51-42.530406.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T21-51-42.530406.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T21-51-42.530406.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["**/details_harness|winogrande|5_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T21-51-42.530406.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T21_51_42.530406", "path": ["results_2024-02-14T21-51-42.530406.parquet"]}, {"split": "latest", "path": 
["results_2024-02-14T21-51-42.530406.parquet"]}]}]} | 2024-02-14T21:54:32+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of bigcode/starcoderbase-1b
Dataset automatically created during the evaluation run of model bigcode/starcoderbase-1b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
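The snippet below mirrors the loading pattern used elsewhere in this document for Open LLM Leaderboard detail datasets; the repository id is inferred from that naming convention, and the `harness_winogrande_5` configuration name comes from this card's config list:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_bigcode__starcoderbase-1b",
	"harness_winogrande_5",
	split="train")
```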
## Latest results
These are the latest results from run 2024-02-14T21:51:42.530406 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of bigcode/starcoderbase-1b\n\n\n\nDataset automatically created during the evaluation run of model bigcode/starcoderbase-1b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T21:51:42.530406(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of bigcode/starcoderbase-1b\n\n\n\nDataset automatically created during the evaluation run of model bigcode/starcoderbase-1b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T21:51:42.530406(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
4b9d4553a0bba954f47d0367ebdcfabed8ad58dc | # Args
```python
{'base_model': 'mistralai/Mistral-7B-v0.1',
'check_length_correctness': True,
'debug': False,
'hf_entity': 'vwxyzjn',
'params': TaskQueryHParams(length=3000,
format_str='SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
truncate_field='post',
truncate_text='\n',
padding='pad_token',
pad_token=[32000],
pad_side='left',
max_query_length=3000,
max_sft_query_response_length=4000,
max_sft_response_length=1500,
max_rm_query_response_length=4500),
'push_to_hub': True}
```
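As a rough illustration only (not the actual preprocessing script), the sketch below shows how hyperparameters such as `format_str`, `truncate_field='post'`, `truncate_text='\n'`, and the left-side padding with `pad_token=[32000]` could be applied to build a fixed-length query; the `build_query` helper is hypothetical.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-v0.1")

FORMAT_STR = "SUBREDDIT: r/{subreddit}\n\nTITLE: {title}\n\nPOST: {post}\n\nTL;DR:"
MAX_QUERY_LENGTH = 3000
PAD_TOKEN_ID = 32000

def build_query(subreddit: str, title: str, post: str) -> list[int]:
    """Hypothetical helper: format the fields, truncate 'post' until it fits, then left-pad."""
    query = FORMAT_STR.format(subreddit=subreddit, title=title, post=post)
    tokens = tokenizer.encode(query, add_special_tokens=False)
    while len(tokens) > MAX_QUERY_LENGTH and post:
        # truncate_field='post', truncate_text='\n': cut the post back to its last newline
        cut = post.rfind("\n")
        post = post[:cut] if cut > 0 else post[: len(post) // 2]
        query = FORMAT_STR.format(subreddit=subreddit, title=title, post=post)
        tokens = tokenizer.encode(query, add_special_tokens=False)
    # pad_side='left', pad_token=[32000]: prepend pad tokens up to the fixed query length
    return [PAD_TOKEN_ID] * max(0, MAX_QUERY_LENGTH - len(tokens)) + tokens
```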
| vwxyzjn/ultrachat_200k_filtered_1707947544 | [
"region:us"
] | 2024-02-14T21:57:43+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_token", "sequence": "int64"}, {"name": "query_reference_response", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_reference_response_token", "sequence": "int64"}, {"name": "query_reference_response_token_len", "dtype": "int64"}, {"name": "query_token_len", "dtype": "int64"}, {"name": "reference_response", "struct": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "reference_response_token", "sequence": "int64"}, {"name": "reference_response_token_len", "dtype": "int64"}], "splits": [{"name": "test_sft", "num_bytes": 1982888370.9168758, "num_examples": 22991}, {"name": "train_sft", "num_bytes": 17846869528.524822, "num_examples": 206698}], "download_size": 3299597538, "dataset_size": 19829757899.441696}} | 2024-02-14T22:02:32+00:00 | [] | [] | TAGS
#region-us
| # Args
| [
"# Args"
] | [
"TAGS\n#region-us \n",
"# Args"
] |
f2544660ed3a5a353a6c1392a3a6f0bc3f281455 | # Dataset Card for "ultrafeedback_binarized_1707947544"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | vwxyzjn/ultrafeedback_binarized_1707947544 | [
"region:us"
] | 2024-02-14T22:01:56+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "score_chosen", "dtype": "float64"}, {"name": "score_rejected", "dtype": "float64"}, {"name": "query", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "query_token", "sequence": "int64"}, {"name": "query_token_len", "dtype": "int64"}, {"name": "chosen_token", "sequence": "int64"}, {"name": "chosen_token_len", "dtype": "int64"}, {"name": "chosen_response_token", "sequence": "int64"}, {"name": "chosen_response_token_len", "dtype": "int64"}, {"name": "rejected_token", "sequence": "int64"}, {"name": "rejected_token_len", "dtype": "int64"}, {"name": "rejected_response_token", "sequence": "int64"}, {"name": "rejected_response_token_len", "dtype": "int64"}], "splits": [{"name": "test_prefs", "num_bytes": 216278383.0, "num_examples": 2000}, {"name": "train_prefs", "num_bytes": 6612816240.948507, "num_examples": 61119}], "download_size": 477181463, "dataset_size": 6829094623.948507}} | 2024-02-14T22:02:31+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "ultrafeedback_binarized_1707947544"
More Information needed | [
"# Dataset Card for \"ultrafeedback_binarized_1707947544\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"ultrafeedback_binarized_1707947544\"\n\nMore Information needed"
] |
f4db1160f2be82a48b45c691ec9f171996f1c966 |
# Dataset Card for Evaluation run of bigcode/starcoderbase-3b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [bigcode/starcoderbase-3b](https://huggingface.co/bigcode/starcoderbase-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bigcode__starcoderbase-3b",
"harness_winogrande_5",
split="train")
```
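The aggregated metrics described above live in a separate "results" configuration; this is a sketch assuming the same `load_dataset` API, with the "latest" split alias these detail datasets use to point at the most recent run:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model
results = load_dataset("open-llm-leaderboard/details_bigcode__starcoderbase-3b",
	"results",
	split="latest")
```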
## Latest results
These are the [latest results from run 2024-02-14T22:11:18.391995](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__starcoderbase-3b/blob/main/results_2024-02-14T22-11-18.391995.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2733652033322023,
"acc_stderr": 0.03160388918604069,
"acc_norm": 0.27526586885613774,
"acc_norm_stderr": 0.03238961456755636,
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.4305451198786013,
"mc2_stderr": 0.014753843652404446
},
"harness|arc:challenge|25": {
"acc": 0.2226962457337884,
"acc_stderr": 0.01215831477482991,
"acc_norm": 0.25853242320819114,
"acc_norm_stderr": 0.012794553754288675
},
"harness|hellaswag|10": {
"acc": 0.33260306711810395,
"acc_stderr": 0.004701828071992653,
"acc_norm": 0.3910575582553276,
"acc_norm_stderr": 0.004869899297734546
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.0391545063041425,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.0391545063041425
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.29056603773584905,
"acc_stderr": 0.02794321998933713,
"acc_norm": 0.29056603773584905,
"acc_norm_stderr": 0.02794321998933713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3263888888888889,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.3263888888888889,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.0321473730202947,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.0321473730202947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33617021276595743,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.33617021276595743,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.03375672449560553,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.03375672449560553
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.022860838309232072,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.022860838309232072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2838709677419355,
"acc_stderr": 0.02564938106302928,
"acc_norm": 0.2838709677419355,
"acc_norm_stderr": 0.02564938106302928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.030108330718011625,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.030108330718011625
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2828282828282828,
"acc_stderr": 0.03208779558786751,
"acc_norm": 0.2828282828282828,
"acc_norm_stderr": 0.03208779558786751
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.25906735751295334,
"acc_stderr": 0.03161877917935411,
"acc_norm": 0.25906735751295334,
"acc_norm_stderr": 0.03161877917935411
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.29743589743589743,
"acc_stderr": 0.023177408131465942,
"acc_norm": 0.29743589743589743,
"acc_norm_stderr": 0.023177408131465942
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712177,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712177
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.33613445378151263,
"acc_stderr": 0.030684737115135356,
"acc_norm": 0.33613445378151263,
"acc_norm_stderr": 0.030684737115135356
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25504587155963304,
"acc_stderr": 0.01868850085653584,
"acc_norm": 0.25504587155963304,
"acc_norm_stderr": 0.01868850085653584
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.03344887382997866,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.03344887382997866
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3632286995515695,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.3632286995515695,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2066115702479339,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.2066115702479339,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404544,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404544
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260597,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260597
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.028120966503914404,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.028120966503914404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2848020434227331,
"acc_stderr": 0.01613917409652258,
"acc_norm": 0.2848020434227331,
"acc_norm_stderr": 0.01613917409652258
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.022894082489925995,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.022894082489925995
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.025261691219729494,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.025261691219729494
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2797427652733119,
"acc_stderr": 0.025494259350694902,
"acc_norm": 0.2797427652733119,
"acc_norm_stderr": 0.025494259350694902
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2716049382716049,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.2716049382716049,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.027281608344469414,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.027281608344469414
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2438070404172099,
"acc_stderr": 0.01096650797217848,
"acc_norm": 0.2438070404172099,
"acc_norm_stderr": 0.01096650797217848
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39338235294117646,
"acc_stderr": 0.02967428828131118,
"acc_norm": 0.39338235294117646,
"acc_norm_stderr": 0.02967428828131118
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.238562091503268,
"acc_stderr": 0.017242385828779606,
"acc_norm": 0.238562091503268,
"acc_norm_stderr": 0.017242385828779606
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2530612244897959,
"acc_stderr": 0.02783302387139969,
"acc_norm": 0.2530612244897959,
"acc_norm_stderr": 0.02783302387139969
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.29518072289156627,
"acc_stderr": 0.03550920185689629,
"acc_norm": 0.29518072289156627,
"acc_norm_stderr": 0.03550920185689629
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2573099415204678,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.2573099415204678,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.4305451198786013,
"mc2_stderr": 0.014753843652404446
},
"harness|winogrande|5": {
"acc": 0.5114443567482242,
"acc_stderr": 0.014048804199859322
},
"harness|gsm8k|5": {
"acc": 0.017437452615617893,
"acc_stderr": 0.0036054868679982793
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_bigcode__starcoderbase-3b | [
"region:us"
] | 2024-02-14T22:13:44+00:00 | {"pretty_name": "Evaluation run of bigcode/starcoderbase-3b", "dataset_summary": "Dataset automatically created during the evaluation run of model [bigcode/starcoderbase-3b](https://huggingface.co/bigcode/starcoderbase-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bigcode__starcoderbase-3b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T22:11:18.391995](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__starcoderbase-3b/blob/main/results_2024-02-14T22-11-18.391995.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2733652033322023,\n \"acc_stderr\": 0.03160388918604069,\n \"acc_norm\": 0.27526586885613774,\n \"acc_norm_stderr\": 0.03238961456755636,\n \"mc1\": 0.25703794369645044,\n \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.4305451198786013,\n \"mc2_stderr\": 0.014753843652404446\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2226962457337884,\n \"acc_stderr\": 0.01215831477482991,\n \"acc_norm\": 0.25853242320819114,\n \"acc_norm_stderr\": 0.012794553754288675\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.33260306711810395,\n \"acc_stderr\": 0.004701828071992653,\n \"acc_norm\": 0.3910575582553276,\n \"acc_norm_stderr\": 0.004869899297734546\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n \"acc_stderr\": 0.0391545063041425,\n \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.0391545063041425\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.29056603773584905,\n \"acc_stderr\": 0.02794321998933713,\n \"acc_norm\": 0.29056603773584905,\n \"acc_norm_stderr\": 0.02794321998933713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3263888888888889,\n \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.3263888888888889,\n \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 
0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n \"acc_stderr\": 0.0321473730202947,\n \"acc_norm\": 0.23121387283236994,\n \"acc_norm_stderr\": 0.0321473730202947\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.33617021276595743,\n \"acc_stderr\": 0.030881618520676942,\n \"acc_norm\": 0.33617021276595743,\n \"acc_norm_stderr\": 0.030881618520676942\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.03375672449560553,\n \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.03375672449560553\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.022860838309232072,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.022860838309232072\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2838709677419355,\n \"acc_stderr\": 0.02564938106302928,\n \"acc_norm\": 0.2838709677419355,\n \"acc_norm_stderr\": 0.02564938106302928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.030108330718011625,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.030108330718011625\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2828282828282828,\n \"acc_stderr\": 0.03208779558786751,\n \"acc_norm\": 0.2828282828282828,\n \"acc_norm_stderr\": 0.03208779558786751\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.25906735751295334,\n \"acc_stderr\": 0.03161877917935411,\n \"acc_norm\": 0.25906735751295334,\n \"acc_norm_stderr\": 0.03161877917935411\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.29743589743589743,\n \"acc_stderr\": 0.023177408131465942,\n \"acc_norm\": 0.29743589743589743,\n \"acc_norm_stderr\": 0.023177408131465942\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712177,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712177\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.33613445378151263,\n \"acc_stderr\": 0.030684737115135356,\n \"acc_norm\": 0.33613445378151263,\n \"acc_norm_stderr\": 0.030684737115135356\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.25504587155963304,\n \"acc_stderr\": 0.01868850085653584,\n \"acc_norm\": 0.25504587155963304,\n \"acc_norm_stderr\": 0.01868850085653584\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4027777777777778,\n \"acc_stderr\": 0.03344887382997866,\n \"acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.03344887382997866\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658342,\n \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658342\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3632286995515695,\n \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.3632286995515695,\n \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2066115702479339,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.2066115702479339,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.034089978868575295,\n \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.034089978868575295\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.042878587513404544,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.042878587513404544\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260597,\n \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260597\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24358974358974358,\n \"acc_stderr\": 0.028120966503914404,\n \"acc_norm\": 0.24358974358974358,\n \"acc_norm_stderr\": 0.028120966503914404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.2848020434227331,\n \"acc_stderr\": 0.01613917409652258,\n \"acc_norm\": 0.2848020434227331,\n \"acc_norm_stderr\": 0.01613917409652258\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.022894082489925995,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.022894082489925995\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.025261691219729494,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.025261691219729494\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2797427652733119,\n \"acc_stderr\": 0.025494259350694902,\n \"acc_norm\": 0.2797427652733119,\n \"acc_norm_stderr\": 0.025494259350694902\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2716049382716049,\n \"acc_stderr\": 0.02474862449053737,\n \"acc_norm\": 0.2716049382716049,\n \"acc_norm_stderr\": 0.02474862449053737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2978723404255319,\n \"acc_stderr\": 0.027281608344469414,\n \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.027281608344469414\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2438070404172099,\n \"acc_stderr\": 0.01096650797217848,\n \"acc_norm\": 0.2438070404172099,\n \"acc_norm_stderr\": 0.01096650797217848\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.39338235294117646,\n \"acc_stderr\": 0.02967428828131118,\n \"acc_norm\": 0.39338235294117646,\n \"acc_norm_stderr\": 0.02967428828131118\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.238562091503268,\n \"acc_stderr\": 0.017242385828779606,\n \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.017242385828779606\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.36363636363636365,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2530612244897959,\n \"acc_stderr\": 0.02783302387139969,\n \"acc_norm\": 0.2530612244897959,\n \"acc_norm_stderr\": 0.02783302387139969\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n \"acc_stderr\": 0.03550920185689629,\n \"acc_norm\": 0.29518072289156627,\n \"acc_norm_stderr\": 0.03550920185689629\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2573099415204678,\n \"acc_stderr\": 0.03352799844161865,\n \"acc_norm\": 0.2573099415204678,\n \"acc_norm_stderr\": 0.03352799844161865\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25703794369645044,\n \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.4305451198786013,\n \"mc2_stderr\": 0.014753843652404446\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5114443567482242,\n \"acc_stderr\": 0.014048804199859322\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.017437452615617893,\n \"acc_stderr\": 
0.0036054868679982793\n }\n}\n```", "repo_url": "https://huggingface.co/bigcode/starcoderbase-3b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|arc:challenge|25_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|gsm8k|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hellaswag|10_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T22-11-18.391995.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T22-11-18.391995.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T22-11-18.391995.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T22-11-18.391995.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T22-11-18.391995.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T22_11_18.391995", "path": ["**/details_harness|winogrande|5_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T22-11-18.391995.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_14T22_11_18.391995", "path": ["results_2024-02-14T22-11-18.391995.parquet"]}, {"split": "latest", "path": ["results_2024-02-14T22-11-18.391995.parquet"]}]}]} | 2024-02-14T22:14:09+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of bigcode/starcoderbase-3b
Dataset automatically created during the evaluation run of model bigcode/starcoderbase-3b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
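A minimal sketch (assuming this run's details repository follows the leaderboard's usual naming pattern, i.e. open-llm-leaderboard/details_bigcode__starcoderbase-3b, and using the harness_winogrande_5 configuration listed in the repository metadata):

```python
from datasets import load_dataset

# Load one per-task configuration; the "train" split points to the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_bigcode__starcoderbase-3b",  # assumed repo name (leaderboard naming pattern)
    "harness_winogrande_5",
    split="train",
)
```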
## Latest results
These are the latest results from run 2024-02-14T22:11:18.391995 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of bigcode/starcoderbase-3b\n\n\n\nDataset automatically created during the evaluation run of model bigcode/starcoderbase-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T22:11:18.391995(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of bigcode/starcoderbase-3b\n\n\n\nDataset automatically created during the evaluation run of model bigcode/starcoderbase-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T22:11:18.391995(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
cc4e6ffec0a5ec39e427e0560207a3f970036b4b | each reaction is designated with three different enzymes | dzjxzyd/rhea_uniprot_reaction_large | [
"license:apache-2.0",
"region:us"
] | 2024-02-14T22:25:57+00:00 | {"license": "apache-2.0"} | 2024-02-16T16:48:12+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
| each reaction is designated with three different enzymes | [] | [
"TAGS\n#license-apache-2.0 #region-us \n"
] |
04f75f77418b1ede55c10cdc6933c8372a9211be |
# Dataset Card for Evaluation run of bigcode/starcoderbase-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [bigcode/starcoderbase-7b](https://huggingface.co/bigcode/starcoderbase-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bigcode__starcoderbase-7b",
"harness_winogrande_5",
split="train")
```
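The aggregated metrics can be pulled the same way from the `"results"` configuration; a sketch, assuming it exposes a `"latest"` split as the sibling details repositories do in their metadata:

```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of the run;
# "latest" is assumed to track the most recent evaluation (per the repo metadata).
results = load_dataset(
    "open-llm-leaderboard/details_bigcode__starcoderbase-7b",
    "results",
    split="latest",
)
```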
## Latest results
These are the [latest results from run 2024-02-14T22:30:37.851656](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__starcoderbase-7b/blob/main/results_2024-02-14T22-30-37.851656.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2855544717164793,
"acc_stderr": 0.032025544877512004,
"acc_norm": 0.28731800624157283,
"acc_norm_stderr": 0.03278279025567369,
"mc1": 0.25091799265605874,
"mc1_stderr": 0.015176985027707696,
"mc2": 0.4046263361255611,
"mc2_stderr": 0.014888506723649383
},
"harness|arc:challenge|25": {
"acc": 0.2508532423208191,
"acc_stderr": 0.012668198621315435,
"acc_norm": 0.2986348122866894,
"acc_norm_stderr": 0.013374078615068756
},
"harness|hellaswag|10": {
"acc": 0.3551085441147182,
"acc_stderr": 0.004775681871529863,
"acc_norm": 0.4386576379207329,
"acc_norm_stderr": 0.004952087083128893
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03459777606810537,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03459777606810537
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.33962264150943394,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.33962264150943394,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3194444444444444,
"acc_stderr": 0.03899073687357336,
"acc_norm": 0.3194444444444444,
"acc_norm_stderr": 0.03899073687357336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.1907514450867052,
"acc_stderr": 0.02995785132986934,
"acc_norm": 0.1907514450867052,
"acc_norm_stderr": 0.02995785132986934
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.31063829787234043,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.31063829787234043,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893596,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.30344827586206896,
"acc_stderr": 0.038312260488503336,
"acc_norm": 0.30344827586206896,
"acc_norm_stderr": 0.038312260488503336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918417,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.036196045241242515,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.036196045241242515
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23870967741935484,
"acc_stderr": 0.024251071262208834,
"acc_norm": 0.23870967741935484,
"acc_norm_stderr": 0.024251071262208834
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03010833071801162,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03010833071801162
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.03191178226713549,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.03191178226713549
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27461139896373055,
"acc_stderr": 0.03221024508041154,
"acc_norm": 0.27461139896373055,
"acc_norm_stderr": 0.03221024508041154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2512820512820513,
"acc_stderr": 0.02199201666237056,
"acc_norm": 0.2512820512820513,
"acc_norm_stderr": 0.02199201666237056
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.02708037281514567,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.02708037281514567
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.028657491285071973,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.028657491285071973
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.03543304234389985,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.03543304234389985
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.27339449541284405,
"acc_stderr": 0.019109299846098278,
"acc_norm": 0.27339449541284405,
"acc_norm_stderr": 0.019109299846098278
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.028353212866863434,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.028353212866863434
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.03256685484460388,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.03256685484460388
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.029571601065753374,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.029571601065753374
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.38565022421524664,
"acc_stderr": 0.03266842214289202,
"acc_norm": 0.38565022421524664,
"acc_norm_stderr": 0.03266842214289202
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.29770992366412213,
"acc_stderr": 0.04010358942462203,
"acc_norm": 0.29770992366412213,
"acc_norm_stderr": 0.04010358942462203
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.38016528925619836,
"acc_stderr": 0.04431324501968431,
"acc_norm": 0.38016528925619836,
"acc_norm_stderr": 0.04431324501968431
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.047323326159788154,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.047323326159788154
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340455,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340455
},
"harness|hendrycksTest-management|5": {
"acc": 0.27184466019417475,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.27184466019417475,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.36752136752136755,
"acc_stderr": 0.03158539157745636,
"acc_norm": 0.36752136752136755,
"acc_norm_stderr": 0.03158539157745636
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3052362707535121,
"acc_stderr": 0.016467711947635123,
"acc_norm": 0.3052362707535121,
"acc_norm_stderr": 0.016467711947635123
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.30346820809248554,
"acc_stderr": 0.024752411960917212,
"acc_norm": 0.30346820809248554,
"acc_norm_stderr": 0.024752411960917212
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961459,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961459
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.025058503316958157,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.025058503316958157
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3086816720257235,
"acc_stderr": 0.026236965881153256,
"acc_norm": 0.3086816720257235,
"acc_norm_stderr": 0.026236965881153256
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2839506172839506,
"acc_stderr": 0.02508947852376513,
"acc_norm": 0.2839506172839506,
"acc_norm_stderr": 0.02508947852376513
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.30851063829787234,
"acc_stderr": 0.027553366165101362,
"acc_norm": 0.30851063829787234,
"acc_norm_stderr": 0.027553366165101362
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2796610169491525,
"acc_stderr": 0.011463397393861974,
"acc_norm": 0.2796610169491525,
"acc_norm_stderr": 0.011463397393861974
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.23161764705882354,
"acc_stderr": 0.025626533803777565,
"acc_norm": 0.23161764705882354,
"acc_norm_stderr": 0.025626533803777565
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2630718954248366,
"acc_stderr": 0.017812676542320657,
"acc_norm": 0.2630718954248366,
"acc_norm_stderr": 0.017812676542320657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.04309118709946459,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.04309118709946459
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.33877551020408164,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.33877551020408164,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.31840796019900497,
"acc_stderr": 0.03294118479054095,
"acc_norm": 0.31840796019900497,
"acc_norm_stderr": 0.03294118479054095
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.03629335329947861,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.03629335329947861
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.03508771929824564,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.03508771929824564
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25091799265605874,
"mc1_stderr": 0.015176985027707696,
"mc2": 0.4046263361255611,
"mc2_stderr": 0.014888506723649383
},
"harness|winogrande|5": {
"acc": 0.5438042620363063,
"acc_stderr": 0.013998453610924324
},
"harness|gsm8k|5": {
"acc": 0.05458680818802123,
"acc_stderr": 0.006257444037912551
}
}
```
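For quick inspection of the per-task numbers above, a small sketch (assuming the JSON block has been saved locally as `results.json`):

```python
import json

# Assumes the JSON results shown above were saved locally as results.json.
with open("results.json") as f:
    results = json.load(f)

# Per-task entries expose "acc"; TruthfulQA reports "mc1"/"mc2" instead, so filter on the key.
per_task = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}
for task, acc in sorted(per_task.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{task}: acc={acc:.3f}")
```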
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_bigcode__starcoderbase-7b | [
"region:us"
] | 2024-02-14T22:33:01+00:00 | {"pretty_name": "Evaluation run of bigcode/starcoderbase-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [bigcode/starcoderbase-7b](https://huggingface.co/bigcode/starcoderbase-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bigcode__starcoderbase-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T22:30:37.851656](https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__starcoderbase-7b/blob/main/results_2024-02-14T22-30-37.851656.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2855544717164793,\n \"acc_stderr\": 0.032025544877512004,\n \"acc_norm\": 0.28731800624157283,\n \"acc_norm_stderr\": 0.03278279025567369,\n \"mc1\": 0.25091799265605874,\n \"mc1_stderr\": 0.015176985027707696,\n \"mc2\": 0.4046263361255611,\n \"mc2_stderr\": 0.014888506723649383\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2508532423208191,\n \"acc_stderr\": 0.012668198621315435,\n \"acc_norm\": 0.2986348122866894,\n \"acc_norm_stderr\": 0.013374078615068756\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3551085441147182,\n \"acc_stderr\": 0.004775681871529863,\n \"acc_norm\": 0.4386576379207329,\n \"acc_norm_stderr\": 0.004952087083128893\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.03459777606810537,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.03459777606810537\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.33962264150943394,\n \"acc_stderr\": 0.02914690474779833,\n \"acc_norm\": 0.33962264150943394,\n \"acc_norm_stderr\": 0.02914690474779833\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3194444444444444,\n \"acc_stderr\": 0.03899073687357336,\n \"acc_norm\": 0.3194444444444444,\n \"acc_norm_stderr\": 0.03899073687357336\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 
0.03942772444036623,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.1907514450867052,\n \"acc_stderr\": 0.02995785132986934,\n \"acc_norm\": 0.1907514450867052,\n \"acc_norm_stderr\": 0.02995785132986934\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.31063829787234043,\n \"acc_stderr\": 0.03025123757921317,\n \"acc_norm\": 0.31063829787234043,\n \"acc_norm_stderr\": 0.03025123757921317\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.041424397194893596,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.041424397194893596\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.30344827586206896,\n \"acc_stderr\": 0.038312260488503336,\n \"acc_norm\": 0.30344827586206896,\n \"acc_norm_stderr\": 0.038312260488503336\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918417,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918417\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23870967741935484,\n \"acc_stderr\": 0.024251071262208834,\n \"acc_norm\": 0.23870967741935484,\n \"acc_norm_stderr\": 0.024251071262208834\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03010833071801162,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03010833071801162\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.033175059300091805,\n \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.033175059300091805\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.03191178226713549,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.03191178226713549\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.27461139896373055,\n \"acc_stderr\": 0.03221024508041154,\n \"acc_norm\": 0.27461139896373055,\n \"acc_norm_stderr\": 0.03221024508041154\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2512820512820513,\n \"acc_stderr\": 0.02199201666237056,\n \"acc_norm\": 0.2512820512820513,\n \"acc_norm_stderr\": 0.02199201666237056\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514567,\n \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514567\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.028657491285071973,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.028657491285071973\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.25165562913907286,\n \"acc_stderr\": 0.03543304234389985,\n \"acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.03543304234389985\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.27339449541284405,\n \"acc_stderr\": 0.019109299846098278,\n \"acc_norm\": 0.27339449541284405,\n \"acc_norm_stderr\": 0.019109299846098278\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.028353212866863434,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.028353212866863434\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.03256685484460388,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.03256685484460388\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2911392405063291,\n \"acc_stderr\": 0.029571601065753374,\n \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.029571601065753374\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.38565022421524664,\n \"acc_stderr\": 0.03266842214289202,\n \"acc_norm\": 0.38565022421524664,\n \"acc_norm_stderr\": 0.03266842214289202\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.04010358942462203,\n \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.04010358942462203\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.38016528925619836,\n \"acc_stderr\": 0.04431324501968431,\n \"acc_norm\": 0.38016528925619836,\n \"acc_norm_stderr\": 0.04431324501968431\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.39814814814814814,\n \"acc_stderr\": 0.047323326159788154,\n \"acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.047323326159788154\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.27184466019417475,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.27184466019417475,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.36752136752136755,\n \"acc_stderr\": 0.03158539157745636,\n \"acc_norm\": 0.36752136752136755,\n \"acc_norm_stderr\": 0.03158539157745636\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.3052362707535121,\n \"acc_stderr\": 0.016467711947635123,\n \"acc_norm\": 0.3052362707535121,\n \"acc_norm_stderr\": 0.016467711947635123\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.30346820809248554,\n \"acc_stderr\": 0.024752411960917212,\n \"acc_norm\": 0.30346820809248554,\n \"acc_norm_stderr\": 0.024752411960917212\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n \"acc_stderr\": 0.014310999547961459,\n \"acc_norm\": 0.24134078212290502,\n \"acc_norm_stderr\": 0.014310999547961459\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958157,\n \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958157\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3086816720257235,\n \"acc_stderr\": 0.026236965881153256,\n \"acc_norm\": 0.3086816720257235,\n \"acc_norm_stderr\": 0.026236965881153256\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2839506172839506,\n \"acc_stderr\": 0.02508947852376513,\n \"acc_norm\": 0.2839506172839506,\n \"acc_norm_stderr\": 0.02508947852376513\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.30851063829787234,\n \"acc_stderr\": 0.027553366165101362,\n \"acc_norm\": 0.30851063829787234,\n \"acc_norm_stderr\": 0.027553366165101362\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2796610169491525,\n \"acc_stderr\": 0.011463397393861974,\n \"acc_norm\": 0.2796610169491525,\n \"acc_norm_stderr\": 0.011463397393861974\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.23161764705882354,\n \"acc_stderr\": 0.025626533803777565,\n \"acc_norm\": 0.23161764705882354,\n \"acc_norm_stderr\": 0.025626533803777565\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2630718954248366,\n \"acc_stderr\": 0.017812676542320657,\n \"acc_norm\": 0.2630718954248366,\n \"acc_norm_stderr\": 0.017812676542320657\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n \"acc_stderr\": 0.04309118709946459,\n \"acc_norm\": 0.2818181818181818,\n \"acc_norm_stderr\": 0.04309118709946459\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.33877551020408164,\n \"acc_stderr\": 0.030299506562154185,\n \"acc_norm\": 0.33877551020408164,\n \"acc_norm_stderr\": 0.030299506562154185\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.31840796019900497,\n \"acc_stderr\": 0.03294118479054095,\n \"acc_norm\": 0.31840796019900497,\n \"acc_norm_stderr\": 0.03294118479054095\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n \"acc_stderr\": 0.03629335329947861,\n \"acc_norm\": 0.3192771084337349,\n \"acc_norm_stderr\": 0.03629335329947861\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.03508771929824564,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.03508771929824564\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25091799265605874,\n \"mc1_stderr\": 0.015176985027707696,\n \"mc2\": 0.4046263361255611,\n \"mc2_stderr\": 0.014888506723649383\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5438042620363063,\n \"acc_stderr\": 0.013998453610924324\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05458680818802123,\n 
\"acc_stderr\": 0.006257444037912551\n }\n}\n```", "repo_url": "https://huggingface.co/bigcode/starcoderbase-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|arc:challenge|25_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|gsm8k|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hellaswag|10_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T22-30-37.851656.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T22-30-37.851656.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T22-30-37.851656.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T22-30-37.851656.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T22-30-37.851656.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T22_30_37.851656", "path": ["**/details_harness|winogrande|5_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T22-30-37.851656.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_14T22_30_37.851656", "path": ["results_2024-02-14T22-30-37.851656.parquet"]}, {"split": "latest", "path": ["results_2024-02-14T22-30-37.851656.parquet"]}]}]} | 2024-02-14T22:33:23+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of bigcode/starcoderbase-7b
Dataset automatically created during the evaluation run of model bigcode/starcoderbase-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
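```python
from datasets import load_dataset

# "harness_winogrande_5" is one of the 63 task configurations listed in this repo;
# the "latest" split of each configuration always points to the most recent run.
data = load_dataset("open-llm-leaderboard/details_bigcode__starcoderbase-7b",
                    "harness_winogrande_5",
                    split="train")
```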
## Latest results
These are the latest results from run 2024-02-14T22:30:37.851656 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of bigcode/starcoderbase-7b\n\n\n\nDataset automatically created during the evaluation run of model bigcode/starcoderbase-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T22:30:37.851656(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of bigcode/starcoderbase-7b\n\n\n\nDataset automatically created during the evaluation run of model bigcode/starcoderbase-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T22:30:37.851656(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
8e6ad06c19659b0f7d48f25fd009f775c7d7c3b3 | 
# Description
This dataset was created by filtering the [aya_dataset](https://huggingface.co/datasets/CohereForAI/aya_dataset) by [CohereForAI](https://huggingface.co/datasets/CohereForAI/) for rows containing Turkish texts.
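The filter is straightforward to reproduce with the `datasets` library. The sketch below is an approximation rather than the exact script used here, and it assumes the upstream `language_code` column marks Turkish rows as `tur`:
```python
from datasets import load_dataset

# Rough reproduction sketch; verify the column names and codes against the schema.
aya = load_dataset("CohereForAI/aya_dataset", split="train")
aya_tur = aya.filter(lambda row: row["language_code"] == "tur")
print(len(aya_tur))  # should roughly match the Turkish row counts reported below
```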
**Training split:** 4046 rows\
**Test split:** 250 rows | sayhan/aya_dataset_tur | [
"size_categories:1K<n<10K",
"language:tr",
"license:apache-2.0",
"region:us"
] | 2024-02-14T22:44:51+00:00 | {"language": ["tr"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "language", "dtype": "string"}, {"name": "language_code", "dtype": "string"}, {"name": "annotation_type", "dtype": "string"}, {"name": "user_id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5092340.457808701, "num_examples": 4046}, {"name": "test", "num_bytes": 254601.14285714287, "num_examples": 250}], "download_size": 1200045, "dataset_size": 5346941.600665844}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-15T01:11:10+00:00 | [] | [
"tr"
] | TAGS
#size_categories-1K<n<10K #language-Turkish #license-apache-2.0 #region-us
| !image/png
# Description
This dataset was created by filtering the aya_dataset by CohereForAI for rows containing Turkish texts.
Training split: 4046 rows\
Test split: 250 rows | [
"# Description\nThis dataset was created by filtering the aya_dataset by CohereForAI for rows containing Turkish texts.\n\nTraining split: 4046 rows\\\nTest split: 250 rows"
] | [
"TAGS\n#size_categories-1K<n<10K #language-Turkish #license-apache-2.0 #region-us \n",
"# Description\nThis dataset was created by filtering the aya_dataset by CohereForAI for rows containing Turkish texts.\n\nTraining split: 4046 rows\\\nTest split: 250 rows"
] |
b057165fc1703b775c2f41b40794130eb4abc4e6 |
# OpenMath GSM8K Masked
We release a *masked* version of the [GSM8K](https://github.com/openai/grade-school-math) solutions.
This data can be used to aid synthetic generation of additional solutions for GSM8K dataset
as it is much less likely to lead to inconsistent reasoning compared to using
the original solutions directly.
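The masked solutions can be pulled straight from the Hub for prompting experiments; the sketch below makes no assumptions about split or column names and simply inspects whatever the repository exposes:
```python
from datasets import load_dataset

masked = load_dataset("nvidia/OpenMath-GSM8K-masked")
print(masked)                     # available splits and their column names
first_split = next(iter(masked))  # e.g. "train", whichever split is published
print(masked[first_split][0])     # one masked problem/solution record
```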
This dataset was used to construct [OpenMathInstruct-1](https://huggingface.co/datasets/nvidia/OpenMathInstruct-1):
a math instruction tuning dataset with 1.8M problem-solution pairs
generated using permissively licensed [Mixtral-8x7B](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) model.
For details of how the masked solutions were created, see our [paper](https://arxiv.org/abs/2402.10176).
You can re-create this dataset or apply similar techniques to mask solutions for other datasets
by using our [open-sourced code](https://github.com/Kipok/NeMo-Skills).
## Citation
If you find our work useful, please consider citing us!
```bibtex
@article{toshniwal2024openmath,
title = {OpenMathInstruct-1: A 1.8 Million Math Instruction Tuning Dataset},
author = {Shubham Toshniwal and Ivan Moshkov and Sean Narenthiran and Daria Gitman and Fei Jia and Igor Gitman},
year = {2024},
journal = {arXiv preprint arXiv: Arxiv-2402.10176}
}
```
## License
The use of this dataset is governed by the [NVIDIA License](LICENSE) which permits commercial usage.
| nvidia/OpenMath-GSM8K-masked | [
"task_categories:question-answering",
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:en",
"license:other",
"math",
"nvidia",
"arxiv:2402.10176",
"region:us"
] | 2024-02-14T23:27:51+00:00 | {"language": ["en"], "license": "other", "size_categories": ["1K<n<10K"], "task_categories": ["question-answering", "text-generation"], "pretty_name": "OpenMath GSM8K Masked", "license_name": "nvidia-license", "tags": ["math", "nvidia"]} | 2024-02-16T02:08:31+00:00 | [
"2402.10176"
] | [
"en"
] | TAGS
#task_categories-question-answering #task_categories-text-generation #size_categories-1K<n<10K #language-English #license-other #math #nvidia #arxiv-2402.10176 #region-us
|
# OpenMath GSM8K Masked
We release a *masked* version of the GSM8K solutions.
This data can be used to aid synthetic generation of additional solutions for GSM8K dataset
as it is much less likely to lead to inconsistent reasoning compared to using
the original solutions directly.
This dataset was used to construct OpenMathInstruct-1:
a math instruction tuning dataset with 1.8M problem-solution pairs
generated using permissively licensed Mixtral-8x7B model.
For details of how the masked solutions were created, see our paper.
You can re-create this dataset or apply similar techniques to mask solutions for other datasets
by using our open-sourced code.
If you find our work useful, please consider citing us!
## License
The use of this dataset is governed by the NVIDIA License which permits commercial usage.
| [
"# OpenMath GSM8K Masked\n\nWe release a *masked* version of the GSM8K solutions.\nThis data can be used to aid synthetic generation of additional solutions for GSM8K dataset\nas it is much less likely to lead to inconsistent reasoning compared to using\nthe original solutions directly.\n\nThis dataset was used to construct OpenMathInstruct-1:\na math instruction tuning dataset with 1.8M problem-solution pairs\ngenerated using permissively licensed Mixtral-8x7B model.\n\nFor details of how the masked solutions were created, see our paper.\n\nYou can re-create this dataset or apply similar techniques to mask solutions for other datasets\nby using our open-sourced code.\n\nIf you find our work useful, please consider citing us!",
"## License\n\nThe use of this dataset is governed by the NVIDIA License which permits commercial usage."
] | [
"TAGS\n#task_categories-question-answering #task_categories-text-generation #size_categories-1K<n<10K #language-English #license-other #math #nvidia #arxiv-2402.10176 #region-us \n",
"# OpenMath GSM8K Masked\n\nWe release a *masked* version of the GSM8K solutions.\nThis data can be used to aid synthetic generation of additional solutions for GSM8K dataset\nas it is much less likely to lead to inconsistent reasoning compared to using\nthe original solutions directly.\n\nThis dataset was used to construct OpenMathInstruct-1:\na math instruction tuning dataset with 1.8M problem-solution pairs\ngenerated using permissively licensed Mixtral-8x7B model.\n\nFor details of how the masked solutions were created, see our paper.\n\nYou can re-create this dataset or apply similar techniques to mask solutions for other datasets\nby using our open-sourced code.\n\nIf you find our work useful, please consider citing us!",
"## License\n\nThe use of this dataset is governed by the NVIDIA License which permits commercial usage."
] |
27e59c95ef77b09eefee1ec6535024e20fe30e96 |
# OpenMath MATH Masked
We release a *masked* version of the [MATH](https://github.com/hendrycks/math) solutions.
This data can be used to aid synthetic generation of additional solutions for MATH dataset
as it is much less likely to lead to inconsistent reasoning compared to using
the original solutions directly.
This dataset was used to construct [OpenMathInstruct-1](https://huggingface.co/datasets/nvidia/OpenMathInstruct-1):
a math instruction tuning dataset with 1.8M problem-solution pairs
generated using permissively licensed [Mixtral-8x7B](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) model.
For details of how the masked solutions were created, see our [paper](https://arxiv.org/abs/2402.10176).
You can re-create this dataset or apply similar techniques to mask solutions for other datasets
by using our [open-sourced code](https://github.com/Kipok/NeMo-Skills).
## Citation
If you find our work useful, please consider citing us!
```bibtex
@article{toshniwal2024openmath,
title = {OpenMathInstruct-1: A 1.8 Million Math Instruction Tuning Dataset},
author = {Shubham Toshniwal and Ivan Moshkov and Sean Narenthiran and Daria Gitman and Fei Jia and Igor Gitman},
year = {2024},
journal = {arXiv preprint arXiv: Arxiv-2402.10176}
}
```
## License
The use of this dataset is governed by the [NVIDIA License](LICENSE) which permits commercial usage.
| nvidia/OpenMath-MATH-masked | [
"task_categories:question-answering",
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:en",
"license:other",
"math",
"nvidia",
"arxiv:2402.10176",
"region:us"
] | 2024-02-14T23:28:13+00:00 | {"language": ["en"], "license": "other", "size_categories": ["1K<n<10K"], "task_categories": ["question-answering", "text-generation"], "pretty_name": "OpenMath MATH Masked", "license_name": "nvidia-license", "tags": ["math", "nvidia"]} | 2024-02-16T02:08:38+00:00 | [
"2402.10176"
] | [
"en"
] | TAGS
#task_categories-question-answering #task_categories-text-generation #size_categories-1K<n<10K #language-English #license-other #math #nvidia #arxiv-2402.10176 #region-us
|
# OpenMath MATH Masked
We release a *masked* version of the MATH solutions.
This data can be used to aid synthetic generation of additional solutions for MATH dataset
as it is much less likely to lead to inconsistent reasoning compared to using
the original solutions directly.
This dataset was used to construct OpenMathInstruct-1:
a math instruction tuning dataset with 1.8M problem-solution pairs
generated using permissively licensed Mixtral-8x7B model.
For details of how the masked solutions were created, see our paper.
You can re-create this dataset or apply similar techniques to mask solutions for other datasets
by using our open-sourced code.
If you find our work useful, please consider citing us!
## License
The use of this dataset is governed by the NVIDIA License which permits commercial usage.
| [
"# OpenMath GSM8K Masked\n\nWe release a *masked* version of the MATH solutions.\nThis data can be used to aid synthetic generation of additional solutions for MATH dataset\nas it is much less likely to lead to inconsistent reasoning compared to using\nthe original solutions directly.\n\nThis dataset was used to construct OpenMathInstruct-1:\na math instruction tuning dataset with 1.8M problem-solution pairs\ngenerated using permissively licensed Mixtral-8x7B model.\n\nFor details of how the masked solutions were created, see our paper.\n\nYou can re-create this dataset or apply similar techniques to mask solutions for other datasets\nby using our open-sourced code.\n\nIf you find our work useful, please consider citing us!",
"## License\n\nThe use of this dataset is governed by the NVIDIA License which permits commercial usage."
] | [
"TAGS\n#task_categories-question-answering #task_categories-text-generation #size_categories-1K<n<10K #language-English #license-other #math #nvidia #arxiv-2402.10176 #region-us \n",
"# OpenMath GSM8K Masked\n\nWe release a *masked* version of the MATH solutions.\nThis data can be used to aid synthetic generation of additional solutions for MATH dataset\nas it is much less likely to lead to inconsistent reasoning compared to using\nthe original solutions directly.\n\nThis dataset was used to construct OpenMathInstruct-1:\na math instruction tuning dataset with 1.8M problem-solution pairs\ngenerated using permissively licensed Mixtral-8x7B model.\n\nFor details of how the masked solutions were created, see our paper.\n\nYou can re-create this dataset or apply similar techniques to mask solutions for other datasets\nby using our open-sourced code.\n\nIf you find our work useful, please consider citing us!",
"## License\n\nThe use of this dataset is governed by the NVIDIA License which permits commercial usage."
] |
dc1f05087dd489612765bc267175f6a9fe3ac721 |
# Overview
Cell2Sentence is a novel method for adapting large language models to single-cell transcriptomics.
We transform single-cell RNA sequencing data into sequences of gene names ordered by expression level, termed "cell sentences".
This dataset was constructed from the immune tissue dataset in [Domínguez et al.](https://www.science.org/doi/10.1126/science.abl5197),
and it was used to train the [Pythia-160m model](https://huggingface.co/EleutherAI/pythia-160m) capable of generating complete cells, as described in our paper.
Details about the Cell2Sentence transformation and preprocessing pipeline can be found in our paper and GitHub repo linked below.
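As a minimal sketch of what the transformation amounts to (assuming a simple rank-by-expression scheme; the gene names below are illustrative, and the real pipeline's normalization and filtering steps are the ones described in the paper and repo):

```python
import numpy as np

def to_cell_sentence(expression, gene_names, top_k=None):
    """Toy version of the cell-sentence idea: order genes by
    descending expression and keep only the expressed ones."""
    order = np.argsort(-np.asarray(expression))
    genes = [gene_names[i] for i in order if expression[i] > 0]
    return " ".join(genes[:top_k] if top_k else genes)

expr = [0.0, 5.2, 1.1, 0.0, 3.3]
names = ["GENE_A", "CD3D", "ACTB", "GENE_B", "LYZ"]
print(to_cell_sentence(expr, names))  # -> "CD3D LYZ ACTB"
```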
GitHub: <https://github.com/vandijklab/cell2sentence-ft>
Paper: <https://www.biorxiv.org/content/10.1101/2023.09.11.557287v3>
Model Card: <https://huggingface.co/vandijklab/pythia-160m-c2s> | vandijklab/immune-c2s | [
"task_categories:text-generation",
"task_categories:question-answering",
"size_categories:100K<n<1M",
"language:en",
"license:cc-by-nc-nd-4.0",
"biology",
"pytorch",
"causal-lm",
"region:us"
] | 2024-02-14T23:30:43+00:00 | {"language": ["en"], "license": "cc-by-nc-nd-4.0", "size_categories": ["100K<n<1M"], "task_categories": ["text-generation", "question-answering"], "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "val", "path": "data/val-*"}]}], "dataset_info": {"features": [{"name": "input_ids", "dtype": "string"}, {"name": "cell_type", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2314316937, "num_examples": 218732}, {"name": "test", "num_bytes": 288846799, "num_examples": 27388}, {"name": "val", "num_bytes": 289505418, "num_examples": 27382}], "download_size": 2322876358, "dataset_size": 2892669154}, "tags": ["biology", "pytorch", "causal-lm"]} | 2024-02-15T02:00:02+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #task_categories-question-answering #size_categories-100K<n<1M #language-English #license-cc-by-nc-nd-4.0 #biology #pytorch #causal-lm #region-us
|
# Overview
Cell2Sentence is a novel method for adapting large language models to single-cell transcriptomics.
We transform single-cell RNA sequencing data into sequences of gene names ordered by expression level, termed "cell sentences".
This dataset was constructed from the immune tissue dataset in Domínguez et al.,
and it was used to train the Pythia-160m model capable of generating complete cells described in our paper.
Details about the Cell2Sentence transformation and preprocessing pipeline can be found in our paper and GitHub repo linked below.
GitHub: <URL
Paper: <URL
Model Card: <URL | [
"# Overview\n\nCell2Sentence is a novel method for adapting large language models to single-cell transcriptomics. \nWe transform single-cell RNA sequencing data into sequences of gene names ordered by expression level, termed \"cell sentences\". \nThis dataset was constructed from the immune tissue dataset in Domínguez et al., \nand it was used to train the Pythia-160m model capable of generating complete cells described in our paper. \nDetails about the Cell2Sentence transformation and preprocessing pipeline can be found in our paper and GitHub repo linked below.\n\nGitHub: <URL \nPaper: <URL \nModel Card: <URL"
] | [
"TAGS\n#task_categories-text-generation #task_categories-question-answering #size_categories-100K<n<1M #language-English #license-cc-by-nc-nd-4.0 #biology #pytorch #causal-lm #region-us \n",
"# Overview\n\nCell2Sentence is a novel method for adapting large language models to single-cell transcriptomics. \nWe transform single-cell RNA sequencing data into sequences of gene names ordered by expression level, termed \"cell sentences\". \nThis dataset was constructed from the immune tissue dataset in Domínguez et al., \nand it was used to train the Pythia-160m model capable of generating complete cells described in our paper. \nDetails about the Cell2Sentence transformation and preprocessing pipeline can be found in our paper and GitHub repo linked below.\n\nGitHub: <URL \nPaper: <URL \nModel Card: <URL"
] |
e5455576f9717d5c0e6db7070c2a180c9580a10d |
# Dataset Card for Evaluation run of touqir/Cyrax-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [touqir/Cyrax-7B](https://huggingface.co/touqir/Cyrax-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_touqir__Cyrax-7B",
"harness_winogrande_5",
split="train")
```
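The aggregated metrics can be loaded in the same way from the "results" configuration; per the file layout in this card's metadata, its most recent run is exposed under a "latest" split:

```python
from datasets import load_dataset

results = load_dataset("open-llm-leaderboard/details_touqir__Cyrax-7B",
	"results",
	split="latest")
```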
## Latest results
These are the [latest results from run 2024-02-14T23:34:58.336806](https://huggingface.co/datasets/open-llm-leaderboard/details_touqir__Cyrax-7B/blob/main/results_2024-02-14T23-34-58.336806.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its "latest" split):
```python
{
"all": {
"acc": 0.6527021919239829,
"acc_stderr": 0.03198108902065119,
"acc_norm": 0.6514575047354266,
"acc_norm_stderr": 0.032651586646568864,
"mc1": 0.6303549571603427,
"mc1_stderr": 0.01689818070697388,
"mc2": 0.7701130353030725,
"mc2_stderr": 0.01431543014964792
},
"harness|arc:challenge|25": {
"acc": 0.7133105802047781,
"acc_stderr": 0.013214986329274777,
"acc_norm": 0.7295221843003413,
"acc_norm_stderr": 0.012980954547659556
},
"harness|hellaswag|10": {
"acc": 0.7457677753435571,
"acc_stderr": 0.004345388614520019,
"acc_norm": 0.8818960366460864,
"acc_norm_stderr": 0.003220716126685038
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371802,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4446927374301676,
"acc_stderr": 0.01661988198817702,
"acc_norm": 0.4446927374301676,
"acc_norm_stderr": 0.01661988198817702
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083136,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083136
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.019047485239360378,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.019047485239360378
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6303549571603427,
"mc1_stderr": 0.01689818070697388,
"mc2": 0.7701130353030725,
"mc2_stderr": 0.01431543014964792
},
"harness|winogrande|5": {
"acc": 0.8389897395422258,
"acc_stderr": 0.01032971283278572
},
"harness|gsm8k|5": {
"acc": 0.6921910538286581,
"acc_stderr": 0.012714401009923649
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_touqir__Cyrax-7B | [
"region:us"
] | 2024-02-14T23:37:17+00:00 | {"pretty_name": "Evaluation run of touqir/Cyrax-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [touqir/Cyrax-7B](https://huggingface.co/touqir/Cyrax-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_touqir__Cyrax-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-14T23:34:58.336806](https://huggingface.co/datasets/open-llm-leaderboard/details_touqir__Cyrax-7B/blob/main/results_2024-02-14T23-34-58.336806.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6527021919239829,\n \"acc_stderr\": 0.03198108902065119,\n \"acc_norm\": 0.6514575047354266,\n \"acc_norm_stderr\": 0.032651586646568864,\n \"mc1\": 0.6303549571603427,\n \"mc1_stderr\": 0.01689818070697388,\n \"mc2\": 0.7701130353030725,\n \"mc2_stderr\": 0.01431543014964792\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7133105802047781,\n \"acc_stderr\": 0.013214986329274777,\n \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7457677753435571,\n \"acc_stderr\": 0.004345388614520019,\n \"acc_norm\": 0.8818960366460864,\n \"acc_norm_stderr\": 0.003220716126685038\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 
0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 
0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n 
\"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4446927374301676,\n \"acc_stderr\": 0.01661988198817702,\n \"acc_norm\": 0.4446927374301676,\n \"acc_norm_stderr\": 0.01661988198817702\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n \"acc_stderr\": 0.012745204626083136,\n \"acc_norm\": 0.46870925684485004,\n \"acc_norm_stderr\": 0.012745204626083136\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360378,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360378\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6303549571603427,\n \"mc1_stderr\": 0.01689818070697388,\n \"mc2\": 0.7701130353030725,\n \"mc2_stderr\": 0.01431543014964792\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8389897395422258,\n \"acc_stderr\": 0.01032971283278572\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6921910538286581,\n \"acc_stderr\": 0.012714401009923649\n }\n}\n```", "repo_url": "https://huggingface.co/touqir/Cyrax-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|arc:challenge|25_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|gsm8k|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hellaswag|10_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T23-34-58.336806.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T23-34-58.336806.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-14T23-34-58.336806.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-14T23-34-58.336806.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T23-34-58.336806.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T23-34-58.336806.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["**/details_harness|winogrande|5_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-14T23-34-58.336806.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_14T23_34_58.336806", "path": ["results_2024-02-14T23-34-58.336806.parquet"]}, {"split": "latest", "path": 
["results_2024-02-14T23-34-58.336806.parquet"]}]}]} | 2024-02-14T23:37:40+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of touqir/Cyrax-7B
Dataset automatically created during the evaluation run of model touqir/Cyrax-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-14T23:34:58.336806 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of touqir/Cyrax-7B\n\n\n\nDataset automatically created during the evaluation run of model touqir/Cyrax-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T23:34:58.336806(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of touqir/Cyrax-7B\n\n\n\nDataset automatically created during the evaluation run of model touqir/Cyrax-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-14T23:34:58.336806(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
80c97bd973f2b7a7d26f192f78ffe795eb733a67 |
# Dataset Card for Evaluation run of Xenon1/Zenith-7B-dpo-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Xenon1/Zenith-7B-dpo-v1](https://huggingface.co/Xenon1/Zenith-7B-dpo-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo-v1",
"harness_winogrande_5",
split="train")
```
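
You can also list the available configurations and load the `latest` split of a specific task (a minimal sketch; the configuration name below is taken from this card's metadata, and `get_dataset_config_names` is a standard helper from the `datasets` library):

```python
from datasets import get_dataset_config_names, load_dataset

# List every configuration stored in this details repository.
configs = get_dataset_config_names("open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo-v1")
print(configs)

# Load the most recent run of a single task, e.g. the 5-shot GSM8K details.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo-v1",
    "harness_gsm8k_5",
    split="latest",
)
print(gsm8k_details)
```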
## Latest results
These are the [latest results from run 2024-02-15T00:49:59.820976](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo-v1/blob/main/results_2024-02-15T00-49-59.820976.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5994007032450553,
"acc_stderr": 0.03314392404148924,
"acc_norm": 0.6077867814262741,
"acc_norm_stderr": 0.033870966769135216,
"mc1": 0.4357405140758874,
"mc1_stderr": 0.017358345398863127,
"mc2": 0.6059869573691794,
"mc2_stderr": 0.015948076495091498
},
"harness|arc:challenge|25": {
"acc": 0.5554607508532423,
"acc_stderr": 0.014521226405627082,
"acc_norm": 0.6049488054607508,
"acc_norm_stderr": 0.014285898292938163
},
"harness|hellaswag|10": {
"acc": 0.6405098585939056,
"acc_stderr": 0.004788703173474748,
"acc_norm": 0.8295160326628161,
"acc_norm_stderr": 0.003752888662249574
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246494,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246494
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7,
"acc_stderr": 0.026069362295335137,
"acc_norm": 0.7,
"acc_norm_stderr": 0.026069362295335137
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.027171213683164525,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.027171213683164525
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5871794871794872,
"acc_stderr": 0.024962683564331796,
"acc_norm": 0.5871794871794872,
"acc_norm_stderr": 0.024962683564331796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616265,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616265
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7889908256880734,
"acc_stderr": 0.01749392240411265,
"acc_norm": 0.7889908256880734,
"acc_norm_stderr": 0.01749392240411265
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044811,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044811
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.040393149787245605,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.040393149787245605
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281344,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281344
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.014987270640946002,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.014987270640946002
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977247,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977247
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.293854748603352,
"acc_stderr": 0.015235075776719608,
"acc_norm": 0.293854748603352,
"acc_norm_stderr": 0.015235075776719608
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.02656892101545715,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.02656892101545715
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.026730620728004913,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.026730620728004913
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6635802469135802,
"acc_stderr": 0.02628973494595293,
"acc_norm": 0.6635802469135802,
"acc_norm_stderr": 0.02628973494595293
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4067796610169492,
"acc_stderr": 0.012546325596569525,
"acc_norm": 0.4067796610169492,
"acc_norm_stderr": 0.012546325596569525
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.625,
"acc_stderr": 0.029408372932278746,
"acc_norm": 0.625,
"acc_norm_stderr": 0.029408372932278746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.619281045751634,
"acc_stderr": 0.019643801557924803,
"acc_norm": 0.619281045751634,
"acc_norm_stderr": 0.019643801557924803
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916718,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916718
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4357405140758874,
"mc1_stderr": 0.017358345398863127,
"mc2": 0.6059869573691794,
"mc2_stderr": 0.015948076495091498
},
"harness|winogrande|5": {
"acc": 0.7726913970007893,
"acc_stderr": 0.011778612167091087
},
"harness|gsm8k|5": {
"acc": 0.16982562547384383,
"acc_stderr": 0.01034257236086122
}
}
```
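
The aggregated numbers shown above are also stored in the `results` configuration mentioned earlier, so they can be loaded directly (a minimal sketch; the `latest` split follows the same convention as the per-task configurations):

```python
from datasets import load_dataset

# Load the aggregated results of the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo-v1",
    "results",
    split="latest",
)
# Each row corresponds to one run; inspect the aggregated metrics.
print(results[0])
```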
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo-v1 | [
"region:us"
] | 2024-02-15T00:45:46+00:00 | {"pretty_name": "Evaluation run of Xenon1/Zenith-7B-dpo-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [Xenon1/Zenith-7B-dpo-v1](https://huggingface.co/Xenon1/Zenith-7B-dpo-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T00:49:59.820976](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo-v1/blob/main/results_2024-02-15T00-49-59.820976.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5994007032450553,\n \"acc_stderr\": 0.03314392404148924,\n \"acc_norm\": 0.6077867814262741,\n \"acc_norm_stderr\": 0.033870966769135216,\n \"mc1\": 0.4357405140758874,\n \"mc1_stderr\": 0.017358345398863127,\n \"mc2\": 0.6059869573691794,\n \"mc2_stderr\": 0.015948076495091498\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5554607508532423,\n \"acc_stderr\": 0.014521226405627082,\n \"acc_norm\": 0.6049488054607508,\n \"acc_norm_stderr\": 0.014285898292938163\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6405098585939056,\n \"acc_stderr\": 0.004788703173474748,\n \"acc_norm\": 0.8295160326628161,\n \"acc_norm_stderr\": 0.003752888662249574\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n 
\"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246494,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246494\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.026069362295335137,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.026069362295335137\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.027171213683164525,\n \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.027171213683164525\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5871794871794872,\n \"acc_stderr\": 
0.024962683564331796,\n \"acc_norm\": 0.5871794871794872,\n \"acc_norm_stderr\": 0.024962683564331796\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616265,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616265\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7889908256880734,\n \"acc_stderr\": 0.01749392240411265,\n \"acc_norm\": 0.7889908256880734,\n \"acc_norm_stderr\": 0.01749392240411265\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044811,\n \"acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044811\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281344,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281344\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n \"acc_stderr\": 0.014987270640946002,\n \"acc_norm\": 0.7726692209450831,\n \"acc_norm_stderr\": 0.014987270640946002\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977247,\n \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977247\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.293854748603352,\n \"acc_stderr\": 0.015235075776719608,\n \"acc_norm\": 0.293854748603352,\n \"acc_norm_stderr\": 0.015235075776719608\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.02656892101545715,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.02656892101545715\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n \"acc_stderr\": 0.026730620728004913,\n \"acc_norm\": 0.6688102893890675,\n \"acc_norm_stderr\": 0.026730620728004913\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6635802469135802,\n \"acc_stderr\": 0.02628973494595293,\n \"acc_norm\": 0.6635802469135802,\n \"acc_norm_stderr\": 0.02628973494595293\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4067796610169492,\n \"acc_stderr\": 0.012546325596569525,\n \"acc_norm\": 0.4067796610169492,\n \"acc_norm_stderr\": 0.012546325596569525\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.029408372932278746,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.029408372932278746\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.619281045751634,\n \"acc_stderr\": 0.019643801557924803,\n \"acc_norm\": 0.619281045751634,\n \"acc_norm_stderr\": 0.019643801557924803\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484378,\n \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484378\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n \"acc_stderr\": 0.030567675938916718,\n \"acc_norm\": 0.7512437810945274,\n \"acc_norm_stderr\": 0.030567675938916718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4357405140758874,\n \"mc1_stderr\": 0.017358345398863127,\n \"mc2\": 0.6059869573691794,\n \"mc2_stderr\": 0.015948076495091498\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7726913970007893,\n \"acc_stderr\": 0.011778612167091087\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.16982562547384383,\n \"acc_stderr\": 0.01034257236086122\n }\n}\n```", "repo_url": "https://huggingface.co/Xenon1/Zenith-7B-dpo-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|arc:challenge|25_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|arc:challenge|25_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|gsm8k|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|gsm8k|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hellaswag|10_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hellaswag|10_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T00-43-26.430787.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T00-43-26.430787.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T00-49-59.820976.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T00-49-59.820976.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T00-49-59.820976.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T00-49-59.820976.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T00-43-26.430787.parquet"]}, 
{"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["**/details_harness|winogrande|5_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": ["**/details_harness|winogrande|5_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T00-49-59.820976.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_15T00_43_26.430787", "path": ["results_2024-02-15T00-43-26.430787.parquet"]}, {"split": "2024_02_15T00_49_59.820976", "path": 
["results_2024-02-15T00-49-59.820976.parquet"]}, {"split": "latest", "path": ["results_2024-02-15T00-49-59.820976.parquet"]}]}]} | 2024-02-15T00:52:41+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Xenon1/Zenith-7B-dpo-v1
Dataset automatically created during the evaluation run of model Xenon1/Zenith-7B-dpo-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-15T00:49:59.820976 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Xenon1/Zenith-7B-dpo-v1\n\n\n\nDataset automatically created during the evaluation run of model Xenon1/Zenith-7B-dpo-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T00:49:59.820976(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Xenon1/Zenith-7B-dpo-v1\n\n\n\nDataset automatically created during the evaluation run of model Xenon1/Zenith-7B-dpo-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T00:49:59.820976(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
47dbd24cb68229d5f796ba31c2e4df305204f52e |
# Dataset Card for Evaluation run of Xenon1/Zenith-7B-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Xenon1/Zenith-7B-dpo](https://huggingface.co/Xenon1/Zenith-7B-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo",
"harness_winogrande_5",
split="train")
```
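
The aggregated metrics shown under "Latest results" can also be loaded directly from the "results" configuration. The snippet below is a minimal sketch and assumes the "results" configuration exposes a "latest" split alongside the timestamped splits, as listed in this card's metadata:

```python
from datasets import load_dataset

# Load the aggregated metrics for this model; the "latest" split points to the
# most recent evaluation run (older runs remain available under timestamped splits).
results = load_dataset("open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo",
                       "results",
                       split="latest")
```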
## Latest results
These are the [latest results from run 2024-02-15T00:56:08.608321](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo/blob/main/results_2024-02-15T00-56-08.608321.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6007056516844032,
"acc_stderr": 0.033168788562989236,
"acc_norm": 0.609221428455361,
"acc_norm_stderr": 0.03389796776551389,
"mc1": 0.4394124847001224,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.605026177940713,
"mc2_stderr": 0.01589658250093076
},
"harness|arc:challenge|25": {
"acc": 0.5588737201365188,
"acc_stderr": 0.014509747749064664,
"acc_norm": 0.6092150170648464,
"acc_norm_stderr": 0.014258563880513782
},
"harness|hellaswag|10": {
"acc": 0.6400119498107947,
"acc_stderr": 0.004790155370993449,
"acc_norm": 0.829416450906194,
"acc_norm_stderr": 0.003753759220205055
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.0379401267469703,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.0379401267469703
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067877,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067877
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6967741935483871,
"acc_stderr": 0.026148685930671753,
"acc_norm": 0.6967741935483871,
"acc_norm_stderr": 0.026148685930671753
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.02493931390694079,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.02493931390694079
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473065,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473065
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059285,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.01726674208763079,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.01726674208763079
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.040393149787245605,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.040393149787245605
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7675606641123882,
"acc_stderr": 0.01510455000890572,
"acc_norm": 0.7675606641123882,
"acc_norm_stderr": 0.01510455000890572
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977247,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977247
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28156424581005585,
"acc_stderr": 0.015042290171866118,
"acc_norm": 0.28156424581005585,
"acc_norm_stderr": 0.015042290171866118
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.026256053835718964,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.026256053835718964
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.026041766202717156,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.026041766202717156
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4152542372881356,
"acc_stderr": 0.012585471793400659,
"acc_norm": 0.4152542372881356,
"acc_norm_stderr": 0.012585471793400659
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.029289413409403192,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.029289413409403192
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.019524316744866353,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.019524316744866353
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505418,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505418
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.03076944496729602,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.03076944496729602
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4394124847001224,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.605026177940713,
"mc2_stderr": 0.01589658250093076
},
"harness|winogrande|5": {
"acc": 0.7726913970007893,
"acc_stderr": 0.011778612167091087
},
"harness|gsm8k|5": {
"acc": 0.16603487490523122,
"acc_stderr": 0.010249811990593532
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo | [
"region:us"
] | 2024-02-15T00:58:28+00:00 | {"pretty_name": "Evaluation run of Xenon1/Zenith-7B-dpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [Xenon1/Zenith-7B-dpo](https://huggingface.co/Xenon1/Zenith-7B-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T00:56:08.608321](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo/blob/main/results_2024-02-15T00-56-08.608321.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6007056516844032,\n \"acc_stderr\": 0.033168788562989236,\n \"acc_norm\": 0.609221428455361,\n \"acc_norm_stderr\": 0.03389796776551389,\n \"mc1\": 0.4394124847001224,\n \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.605026177940713,\n \"mc2_stderr\": 0.01589658250093076\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5588737201365188,\n \"acc_stderr\": 0.014509747749064664,\n \"acc_norm\": 0.6092150170648464,\n \"acc_norm_stderr\": 0.014258563880513782\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6400119498107947,\n \"acc_stderr\": 0.004790155370993449,\n \"acc_norm\": 0.829416450906194,\n \"acc_norm_stderr\": 0.003753759220205055\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 
0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n \"acc_stderr\": 0.0379401267469703,\n \"acc_norm\": 0.5491329479768786,\n \"acc_norm_stderr\": 0.0379401267469703\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067877,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067877\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6967741935483871,\n \"acc_stderr\": 0.026148685930671753,\n \"acc_norm\": 0.6967741935483871,\n \"acc_norm_stderr\": 0.026148685930671753\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5897435897435898,\n \"acc_stderr\": 
0.02493931390694079,\n \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.02493931390694079\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473065,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473065\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059285,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059285\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7963302752293578,\n \"acc_stderr\": 0.01726674208763079,\n \"acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.01726674208763079\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507332,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507332\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7675606641123882,\n \"acc_stderr\": 0.01510455000890572,\n \"acc_norm\": 0.7675606641123882,\n \"acc_norm_stderr\": 0.01510455000890572\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977247,\n \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977247\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28156424581005585,\n \"acc_stderr\": 0.015042290171866118,\n \"acc_norm\": 0.28156424581005585,\n \"acc_norm_stderr\": 0.015042290171866118\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718964,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718964\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.026041766202717156,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.026041766202717156\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4152542372881356,\n \"acc_stderr\": 0.012585471793400659,\n \"acc_norm\": 0.4152542372881356,\n \"acc_norm_stderr\": 0.012585471793400659\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.029289413409403192,\n \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.029289413409403192\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.019524316744866353,\n \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.019524316744866353\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.04582004841505418,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.04582004841505418\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.746268656716418,\n \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4394124847001224,\n \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.605026177940713,\n \"mc2_stderr\": 0.01589658250093076\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7726913970007893,\n \"acc_stderr\": 0.011778612167091087\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.16603487490523122,\n \"acc_stderr\": 0.010249811990593532\n }\n}\n```", "repo_url": "https://huggingface.co/Xenon1/Zenith-7B-dpo", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|arc:challenge|25_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|gsm8k|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hellaswag|10_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T00-56-08.608321.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T00-56-08.608321.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T00-56-08.608321.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T00-56-08.608321.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T00-56-08.608321.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T00-56-08.608321.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["**/details_harness|winogrande|5_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T00-56-08.608321.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_15T00_56_08.608321", "path": ["results_2024-02-15T00-56-08.608321.parquet"]}, {"split": "latest", "path": 
["results_2024-02-15T00-56-08.608321.parquet"]}]}]} | 2024-02-15T00:58:52+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Xenon1/Zenith-7B-dpo
Dataset automatically created during the evaluation run of model Xenon1/Zenith-7B-dpo on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
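(The snippet below mirrors the loader these leaderboard cards ship; the `harness_winogrande_5` configuration name is taken from this dataset's own configuration list, and any other listed config can be substituted.)

```python
from datasets import load_dataset

# each evaluated task is a separate configuration; "train" points at the latest run
data = load_dataset("open-llm-leaderboard/details_Xenon1__Zenith-7B-dpo",
	"harness_winogrande_5",
	split="train")
```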
## Latest results
These are the latest results from run 2024-02-15T00:56:08.608321 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results files and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Xenon1/Zenith-7B-dpo\n\n\n\nDataset automatically created during the evaluation run of model Xenon1/Zenith-7B-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T00:56:08.608321(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Xenon1/Zenith-7B-dpo\n\n\n\nDataset automatically created during the evaluation run of model Xenon1/Zenith-7B-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T00:56:08.608321(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
454848fda7d447955a4def67317120b42cbbb4b9 |
# Dataset Card for Evaluation run of Josephgflowers/tinyllama-730M-test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Josephgflowers/tinyllama-730M-test](https://huggingface.co/Josephgflowers/tinyllama-730M-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Josephgflowers__tinyllama-730M-test",
"harness_winogrande_5",
split="train")
```
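
The aggregated metrics described above can be pulled the same way through the "results" configuration; a minimal sketch, assuming the "latest" split naming these cards use for the most recent run:

```python
from datasets import load_dataset

# aggregated metrics for the newest evaluation run of this model
results = load_dataset("open-llm-leaderboard/details_Josephgflowers__tinyllama-730M-test",
	"results",
	split="latest")
```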
## Latest results
These are the [latest results from run 2024-02-15T01:02:27.411834](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__tinyllama-730M-test/blob/main/results_2024-02-15T01-02-27.411834.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results files and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24533600070728154,
"acc_stderr": 0.030183069676177794,
"acc_norm": 0.2460404616295554,
"acc_norm_stderr": 0.03097824889414466,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766375,
"mc2": 0.4290335878750869,
"mc2_stderr": 0.015203430898769523
},
"harness|arc:challenge|25": {
"acc": 0.22525597269624573,
"acc_stderr": 0.012207839995407305,
"acc_norm": 0.2508532423208191,
"acc_norm_stderr": 0.012668198621315433
},
"harness|hellaswag|10": {
"acc": 0.302230631348337,
"acc_stderr": 0.004582861219020891,
"acc_norm": 0.3381796454889464,
"acc_norm_stderr": 0.004721231637092727
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036843,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036843
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.040247784019771096,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.040247784019771096
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.13815789473684212,
"acc_stderr": 0.028081042939576552,
"acc_norm": 0.13815789473684212,
"acc_norm_stderr": 0.028081042939576552
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826372,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826372
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.02512576648482784,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.02512576648482784
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.15,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.15,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617746,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617746
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20425531914893616,
"acc_stderr": 0.026355158413349424,
"acc_norm": 0.20425531914893616,
"acc_norm_stderr": 0.026355158413349424
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.21379310344827587,
"acc_stderr": 0.034165204477475494,
"acc_norm": 0.21379310344827587,
"acc_norm_stderr": 0.034165204477475494
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.03395490020856112,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.03395490020856112
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.32903225806451614,
"acc_stderr": 0.02672949906834996,
"acc_norm": 0.32903225806451614,
"acc_norm_stderr": 0.02672949906834996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25252525252525254,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.25252525252525254,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.26424870466321243,
"acc_stderr": 0.03182155050916646,
"acc_norm": 0.26424870466321243,
"acc_norm_stderr": 0.03182155050916646
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.020932445774463203,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.020932445774463203
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.027025433498882367,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.027025433498882367
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22385321100917432,
"acc_stderr": 0.01787121776779022,
"acc_norm": 0.22385321100917432,
"acc_norm_stderr": 0.01787121776779022
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044811,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044811
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.03132179803083291,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.03132179803083291
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2109704641350211,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.2109704641350211,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.23318385650224216,
"acc_stderr": 0.028380391147094713,
"acc_norm": 0.23318385650224216,
"acc_norm_stderr": 0.028380391147094713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.038073871163060866,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.038073871163060866
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.35537190082644626,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.35537190082644626,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3067484662576687,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.3067484662576687,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952687,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952687
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.18803418803418803,
"acc_stderr": 0.025598193686652265,
"acc_norm": 0.18803418803418803,
"acc_norm_stderr": 0.025598193686652265
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653697,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653697
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26309067688378035,
"acc_stderr": 0.015745497169049053,
"acc_norm": 0.26309067688378035,
"acc_norm_stderr": 0.015745497169049053
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2514450867052023,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.2514450867052023,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.025058503316958154,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.025058503316958154
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.30246913580246915,
"acc_stderr": 0.025557653981868045,
"acc_norm": 0.30246913580246915,
"acc_norm_stderr": 0.025557653981868045
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24119947848761408,
"acc_stderr": 0.01092649610203495,
"acc_norm": 0.24119947848761408,
"acc_norm_stderr": 0.01092649610203495
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.21691176470588236,
"acc_stderr": 0.025035845227711257,
"acc_norm": 0.21691176470588236,
"acc_norm_stderr": 0.025035845227711257
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.017740899509177795,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.017740899509177795
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.041220665028782834,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.041220665028782834
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21393034825870647,
"acc_stderr": 0.02899690969332891,
"acc_norm": 0.21393034825870647,
"acc_norm_stderr": 0.02899690969332891
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.20481927710843373,
"acc_stderr": 0.03141784291663926,
"acc_norm": 0.20481927710843373,
"acc_norm_stderr": 0.03141784291663926
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.033773102522091945,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.033773102522091945
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766375,
"mc2": 0.4290335878750869,
"mc2_stderr": 0.015203430898769523
},
"harness|winogrande|5": {
"acc": 0.510655090765588,
"acc_stderr": 0.014049294536290403
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
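Each per-task entry above pairs an accuracy (`acc`, `acc_norm`) with its standard error. As a reading aid (a small sketch, not part of the generated card), a normal-approximation 95% confidence interval can be computed from those two numbers:
```python
def confidence_interval(acc: float, stderr: float, z: float = 1.96) -> tuple[float, float]:
    """Normal-approximation confidence interval around a reported accuracy."""
    return acc - z * stderr, acc + z * stderr

# Example with the winogrande figures reported above.
low, high = confidence_interval(0.510655090765588, 0.014049294536290403)
print(f"winogrande acc 95% CI: [{low:.3f}, {high:.3f}]")
```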
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Josephgflowers__tinyllama-730M-test | [
"region:us"
] | 2024-02-15T01:04:18+00:00 | {"pretty_name": "Evaluation run of Josephgflowers/tinyllama-730M-test", "dataset_summary": "Dataset automatically created during the evaluation run of model [Josephgflowers/tinyllama-730M-test](https://huggingface.co/Josephgflowers/tinyllama-730M-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Josephgflowers__tinyllama-730M-test\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T01:02:27.411834](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__tinyllama-730M-test/blob/main/results_2024-02-15T01-02-27.411834.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24533600070728154,\n \"acc_stderr\": 0.030183069676177794,\n \"acc_norm\": 0.2460404616295554,\n \"acc_norm_stderr\": 0.03097824889414466,\n \"mc1\": 0.2607099143206854,\n \"mc1_stderr\": 0.015368841620766375,\n \"mc2\": 0.4290335878750869,\n \"mc2_stderr\": 0.015203430898769523\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22525597269624573,\n \"acc_stderr\": 0.012207839995407305,\n \"acc_norm\": 0.2508532423208191,\n \"acc_norm_stderr\": 0.012668198621315433\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.302230631348337,\n \"acc_stderr\": 0.004582861219020891,\n \"acc_norm\": 0.3381796454889464,\n \"acc_norm_stderr\": 0.004721231637092727\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036843,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036843\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.040247784019771096,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.040247784019771096\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.13815789473684212,\n \"acc_stderr\": 0.028081042939576552,\n \"acc_norm\": 0.13815789473684212,\n \"acc_norm_stderr\": 0.028081042939576552\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.15,\n \"acc_stderr\": 0.03588702812826372,\n \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.03588702812826372\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.02512576648482784,\n \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.02512576648482784\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.22916666666666666,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.15,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617746,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617746\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.20425531914893616,\n \"acc_stderr\": 0.026355158413349424,\n \"acc_norm\": 0.20425531914893616,\n \"acc_norm_stderr\": 0.026355158413349424\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.21379310344827587,\n \"acc_stderr\": 0.034165204477475494,\n \"acc_norm\": 0.21379310344827587,\n \"acc_norm_stderr\": 0.034165204477475494\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n \"acc_stderr\": 0.03395490020856112,\n \"acc_norm\": 0.1746031746031746,\n \"acc_norm_stderr\": 0.03395490020856112\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.32903225806451614,\n \"acc_stderr\": 0.02672949906834996,\n \"acc_norm\": 0.32903225806451614,\n \"acc_norm_stderr\": 0.02672949906834996\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.26424870466321243,\n \"acc_stderr\": 0.03182155050916646,\n \"acc_norm\": 0.26424870466321243,\n \"acc_norm_stderr\": 0.03182155050916646\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.21794871794871795,\n \"acc_stderr\": 0.020932445774463203,\n \"acc_norm\": 0.21794871794871795,\n \"acc_norm_stderr\": 0.020932445774463203\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882367,\n \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882367\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.22385321100917432,\n \"acc_stderr\": 0.01787121776779022,\n \"acc_norm\": 0.22385321100917432,\n \"acc_norm_stderr\": 0.01787121776779022\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044811,\n \"acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044811\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.03132179803083291,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.03132179803083291\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2109704641350211,\n \"acc_stderr\": 0.02655837250266192,\n \"acc_norm\": 0.2109704641350211,\n \"acc_norm_stderr\": 0.02655837250266192\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.23318385650224216,\n \"acc_stderr\": 0.028380391147094713,\n \"acc_norm\": 0.23318385650224216,\n \"acc_norm_stderr\": 0.028380391147094713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.038073871163060866,\n \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.038073871163060866\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.35537190082644626,\n \"acc_stderr\": 0.04369236326573981,\n \"acc_norm\": 0.35537190082644626,\n \"acc_norm_stderr\": 0.04369236326573981\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.036230899157241474,\n \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.036230899157241474\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.039166677628225836,\n \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.039166677628225836\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.18803418803418803,\n \"acc_stderr\": 0.025598193686652265,\n \"acc_norm\": 0.18803418803418803,\n \"acc_norm_stderr\": 0.025598193686652265\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653697,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653697\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26309067688378035,\n \"acc_stderr\": 0.015745497169049053,\n \"acc_norm\": 0.26309067688378035,\n \"acc_norm_stderr\": 0.015745497169049053\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2514450867052023,\n \"acc_stderr\": 0.02335736578587403,\n \"acc_norm\": 0.2514450867052023,\n \"acc_norm_stderr\": 0.02335736578587403\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958154,\n \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958154\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.2990353697749196,\n \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.30246913580246915,\n \"acc_stderr\": 0.025557653981868045,\n \"acc_norm\": 0.30246913580246915,\n \"acc_norm_stderr\": 0.025557653981868045\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843,\n \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24119947848761408,\n \"acc_stderr\": 0.01092649610203495,\n \"acc_norm\": 0.24119947848761408,\n \"acc_norm_stderr\": 0.01092649610203495\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.21691176470588236,\n \"acc_stderr\": 0.025035845227711257,\n \"acc_norm\": 0.21691176470588236,\n \"acc_norm_stderr\": 0.025035845227711257\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25980392156862747,\n \"acc_stderr\": 0.017740899509177795,\n \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.017740899509177795\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n \"acc_stderr\": 0.041220665028782834,\n \"acc_norm\": 0.24545454545454545,\n \"acc_norm_stderr\": 0.041220665028782834\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21393034825870647,\n \"acc_stderr\": 0.02899690969332891,\n \"acc_norm\": 0.21393034825870647,\n \"acc_norm_stderr\": 0.02899690969332891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.20481927710843373,\n \"acc_stderr\": 0.03141784291663926,\n \"acc_norm\": 0.20481927710843373,\n \"acc_norm_stderr\": 0.03141784291663926\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.033773102522091945,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.033773102522091945\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n \"mc1_stderr\": 0.015368841620766375,\n \"mc2\": 0.4290335878750869,\n \"mc2_stderr\": 0.015203430898769523\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.510655090765588,\n \"acc_stderr\": 0.014049294536290403\n },\n \"harness|gsm8k|5\": 
{\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/Josephgflowers/tinyllama-730M-test", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|arc:challenge|25_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|gsm8k|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hellaswag|10_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T01-02-27.411834.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T01-02-27.411834.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T01-02-27.411834.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T01-02-27.411834.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T01-02-27.411834.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T01_02_27.411834", "path": ["**/details_harness|winogrande|5_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T01-02-27.411834.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_15T01_02_27.411834", "path": ["results_2024-02-15T01-02-27.411834.parquet"]}, {"split": "latest", "path": ["results_2024-02-15T01-02-27.411834.parquet"]}]}]} | 2024-02-15T01:04:42+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Josephgflowers/tinyllama-730M-test
Dataset automatically created during the evaluation run of model Josephgflowers/tinyllama-730M-test on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
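A minimal sketch with the Hugging Face `datasets` library (the configuration name used here, `harness_winogrande_5`, is one of the per-task configs listed in this repository's metadata; any other config works the same way):
```python
from datasets import load_dataset

# Per-example details for one task configuration of this evaluation run.
# The "train" split always points to the latest results for that config.
data = load_dataset(
    "open-llm-leaderboard/details_Josephgflowers__tinyllama-730M-test",
    "harness_winogrande_5",
    split="train",
)
print(data)
```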
## Latest results
These are the latest results from run 2024-02-15T01:02:27.411834 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
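The full per-task JSON for this run is reproduced earlier in this card; the same aggregated numbers can also be retrieved programmatically (a minimal sketch, assuming the "results" configuration and "latest" split described above; the exact column layout of the returned row is not guaranteed):
```python
from datasets import load_dataset

# The "results" configuration aggregates every per-task metric of the run;
# the "latest" split mirrors the most recent evaluation timestamp.
results = load_dataset(
    "open-llm-leaderboard/details_Josephgflowers__tinyllama-730M-test",
    "results",
    split="latest",
)
print(results[0])  # single row holding the aggregated metrics
```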
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Josephgflowers/tinyllama-730M-test\n\n\n\nDataset automatically created during the evaluation run of model Josephgflowers/tinyllama-730M-test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T01:02:27.411834(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Josephgflowers/tinyllama-730M-test\n\n\n\nDataset automatically created during the evaluation run of model Josephgflowers/tinyllama-730M-test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T01:02:27.411834(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
7b19d00cb53ae110d572854758563bd41d6fd0f0 |
# Dataset Card for Evaluation run of BarraHome/Wistral-7B-Instruct-v0.3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BarraHome/Wistral-7B-Instruct-v0.3](https://huggingface.co/BarraHome/Wistral-7B-Instruct-v0.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BarraHome__Wistral-7B-Instruct-v0.3",
"harness_winogrande_5",
split="train")
```
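
You can also enumerate the available configurations before choosing one, or load the aggregated `results` configuration directly. The snippet below is a minimal sketch assuming the standard `datasets` API; the `latest` split always points to the most recent run:

```python
from datasets import get_dataset_config_names, load_dataset

# One configuration per evaluated task, plus the aggregated "results" configuration
configs = get_dataset_config_names("open-llm-leaderboard/details_BarraHome__Wistral-7B-Instruct-v0.3")
print(len(configs), configs[:5])

# Load the aggregated results; "latest" always points to the most recent run
results = load_dataset("open-llm-leaderboard/details_BarraHome__Wistral-7B-Instruct-v0.3",
                       "results",
                       split="latest")
```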
## Latest results
These are the [latest results from run 2024-02-15T19:11:00.348101](https://huggingface.co/datasets/open-llm-leaderboard/details_BarraHome__Wistral-7B-Instruct-v0.3/blob/main/results_2024-02-15T19-11-00.348101.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6029693302455481,
"acc_stderr": 0.0333331469164389,
"acc_norm": 0.6076195134770176,
"acc_norm_stderr": 0.03400970231641362,
"mc1": 0.5214198286413708,
"mc1_stderr": 0.01748743214471164,
"mc2": 0.6762417557185662,
"mc2_stderr": 0.01527040994051319
},
"harness|arc:challenge|25": {
"acc": 0.575938566552901,
"acc_stderr": 0.014441889627464394,
"acc_norm": 0.6220136518771331,
"acc_norm_stderr": 0.0141696645203031
},
"harness|hellaswag|10": {
"acc": 0.6609241187014538,
"acc_stderr": 0.004724281487819376,
"acc_norm": 0.8477394941246763,
"acc_norm_stderr": 0.0035853896364723727
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159795,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159795
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.026593084516572277,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.026593084516572277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5564102564102564,
"acc_stderr": 0.0251891498947642,
"acc_norm": 0.5564102564102564,
"acc_norm_stderr": 0.0251891498947642
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.0386155754625517,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.0386155754625517
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8,
"acc_stderr": 0.017149858514250955,
"acc_norm": 0.8,
"acc_norm_stderr": 0.017149858514250955
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597552,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597552
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.014866821664709581,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.014866821664709581
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688225,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688225
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34972067039106147,
"acc_stderr": 0.01594930879023364,
"acc_norm": 0.34972067039106147,
"acc_norm_stderr": 0.01594930879023364
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.02671611838015685,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.02671611838015685
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6790123456790124,
"acc_stderr": 0.025976566010862744,
"acc_norm": 0.6790123456790124,
"acc_norm_stderr": 0.025976566010862744
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42894393741851367,
"acc_stderr": 0.012640625443067354,
"acc_norm": 0.42894393741851367,
"acc_norm_stderr": 0.012640625443067354
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.029768263528933105,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.029768263528933105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.019722058939618068,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.019722058939618068
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835816,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835816
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5214198286413708,
"mc1_stderr": 0.01748743214471164,
"mc2": 0.6762417557185662,
"mc2_stderr": 0.01527040994051319
},
"harness|winogrande|5": {
"acc": 0.7679558011049724,
"acc_stderr": 0.011864149691827938
},
"harness|gsm8k|5": {
"acc": 0.39651250947687644,
"acc_stderr": 0.013474258584033345
}
}
```
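
As a quick sanity check, you can recompute aggregate metrics from this dictionary. The sketch below is a minimal example, assuming the JSON above has been parsed into a Python dict named `results`; it averages the normalized accuracy over the MMLU (`hendrycksTest`) subtasks:

```python
# `results` is assumed to be the dictionary shown above
mmlu_scores = [
    task["acc_norm"]
    for name, task in results.items()
    if name.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu_scores)} MMLU subtasks, mean acc_norm = {sum(mmlu_scores) / len(mmlu_scores):.4f}")
```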
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BarraHome__Wistral-7B-Instruct-v0.3 | [
"region:us"
] | 2024-02-15T01:33:29+00:00 | {"pretty_name": "Evaluation run of BarraHome/Wistral-7B-Instruct-v0.3", "dataset_summary": "Dataset automatically created during the evaluation run of model [BarraHome/Wistral-7B-Instruct-v0.3](https://huggingface.co/BarraHome/Wistral-7B-Instruct-v0.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BarraHome__Wistral-7B-Instruct-v0.3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T19:11:00.348101](https://huggingface.co/datasets/open-llm-leaderboard/details_BarraHome__Wistral-7B-Instruct-v0.3/blob/main/results_2024-02-15T19-11-00.348101.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6029693302455481,\n \"acc_stderr\": 0.0333331469164389,\n \"acc_norm\": 0.6076195134770176,\n \"acc_norm_stderr\": 0.03400970231641362,\n \"mc1\": 0.5214198286413708,\n \"mc1_stderr\": 0.01748743214471164,\n \"mc2\": 0.6762417557185662,\n \"mc2_stderr\": 0.01527040994051319\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.575938566552901,\n \"acc_stderr\": 0.014441889627464394,\n \"acc_norm\": 0.6220136518771331,\n \"acc_norm_stderr\": 0.0141696645203031\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6609241187014538,\n \"acc_stderr\": 0.004724281487819376,\n \"acc_norm\": 0.8477394941246763,\n \"acc_norm_stderr\": 0.0035853896364723727\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 
0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159795,\n \"acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159795\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n \"acc_stderr\": 0.026593084516572277,\n \"acc_norm\": 0.6774193548387096,\n \"acc_norm_stderr\": 0.026593084516572277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153314\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5564102564102564,\n \"acc_stderr\": 0.0251891498947642,\n \"acc_norm\": 0.5564102564102564,\n \"acc_norm_stderr\": 0.0251891498947642\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.017149858514250955,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.017149858514250955\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.014866821664709581,\n 
\"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.014866821664709581\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688225,\n \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688225\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34972067039106147,\n \"acc_stderr\": 0.01594930879023364,\n \"acc_norm\": 0.34972067039106147,\n \"acc_norm_stderr\": 0.01594930879023364\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.02671611838015685,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.02671611838015685\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6790123456790124,\n \"acc_stderr\": 0.025976566010862744,\n \"acc_norm\": 0.6790123456790124,\n \"acc_norm_stderr\": 0.025976566010862744\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42894393741851367,\n \"acc_stderr\": 0.012640625443067354,\n \"acc_norm\": 0.42894393741851367,\n \"acc_norm_stderr\": 0.012640625443067354\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.019722058939618068,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.019722058939618068\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835816,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835816\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5214198286413708,\n \"mc1_stderr\": 0.01748743214471164,\n \"mc2\": 0.6762417557185662,\n \"mc2_stderr\": 0.01527040994051319\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.011864149691827938\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.39651250947687644,\n \"acc_stderr\": 0.013474258584033345\n }\n}\n```", "repo_url": "https://huggingface.co/BarraHome/Wistral-7B-Instruct-v0.3", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|arc:challenge|25_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|arc:challenge|25_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|gsm8k|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|gsm8k|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hellaswag|10_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hellaswag|10_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T01-31-11.622496.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T01-31-11.622496.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T19-11-00.348101.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T19-11-00.348101.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T19-11-00.348101.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T19-11-00.348101.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T01-31-11.622496.parquet"]}, 
{"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["**/details_harness|winogrande|5_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": ["**/details_harness|winogrande|5_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T19-11-00.348101.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_15T01_31_11.622496", "path": ["results_2024-02-15T01-31-11.622496.parquet"]}, {"split": "2024_02_15T19_11_00.348101", "path": 
["results_2024-02-15T19-11-00.348101.parquet"]}, {"split": "latest", "path": ["results_2024-02-15T19-11-00.348101.parquet"]}]}]} | 2024-02-15T19:13:57+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BarraHome/Wistral-7B-Instruct-v0.3
Dataset automatically created during the evaluation run of model BarraHome/Wistral-7B-Instruct-v0.3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
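A minimal loading sketch follows; the repository id is inferred from the `details_<org>__<model>` naming convention used by the other cards in this document, so treat it as an assumption:

```python
from datasets import load_dataset

# Load the 5-shot Winogrande details split for this model
# (repo id assumed from the standard Open LLM Leaderboard naming convention)
data = load_dataset("open-llm-leaderboard/details_BarraHome__Wistral-7B-Instruct-v0.3",
	"harness_winogrande_5",
	split="train")
```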
## Latest results
These are the latest results from run 2024-02-15T19:11:00.348101 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BarraHome/Wistral-7B-Instruct-v0.3\n\n\n\nDataset automatically created during the evaluation run of model BarraHome/Wistral-7B-Instruct-v0.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T19:11:00.348101(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BarraHome/Wistral-7B-Instruct-v0.3\n\n\n\nDataset automatically created during the evaluation run of model BarraHome/Wistral-7B-Instruct-v0.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T19:11:00.348101(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
09db98888f6656e95332f19c3acd9396040a1486 |
# Dataset Card for Evaluation run of Xenon1/Eclipse-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Xenon1/Eclipse-7B](https://huggingface.co/Xenon1/Eclipse-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xenon1__Eclipse-7B",
"harness_winogrande_5",
split="train")
```
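The aggregated metrics can be retrieved in the same way through the "results" configuration; a short sketch (config and split names follow the metadata listed at the end of this card):

```python
from datasets import load_dataset

# "results" stores the aggregated scores; the "latest" split always points to the newest run
results = load_dataset("open-llm-leaderboard/details_Xenon1__Eclipse-7B",
	"results",
	split="latest")
print(results[0])  # inspect the first (and typically only) row with the per-task metrics
```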
## Latest results
These are the [latest results from run 2024-02-15T01:56:20.654560](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Eclipse-7B/blob/main/results_2024-02-15T01-56-20.654560.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6504871112025731,
"acc_stderr": 0.032106673784384795,
"acc_norm": 0.6520453433995954,
"acc_norm_stderr": 0.032770034329884165,
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172012,
"mc2": 0.5337238959396524,
"mc2_stderr": 0.014980829261717704
},
"harness|arc:challenge|25": {
"acc": 0.5930034129692833,
"acc_stderr": 0.014356399418009123,
"acc_norm": 0.6254266211604096,
"acc_norm_stderr": 0.014144193471893458
},
"harness|hellaswag|10": {
"acc": 0.6384186417048396,
"acc_stderr": 0.00479476484368527,
"acc_norm": 0.8418641704839673,
"acc_norm_stderr": 0.0036412262941678
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438655,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396264,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396264
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.024251071262208837,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.024251071262208837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.024035489676335082,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.024035489676335082
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.02925290592725197,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.02925290592725197
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7226890756302521,
"acc_stderr": 0.029079374539480007,
"acc_norm": 0.7226890756302521,
"acc_norm_stderr": 0.029079374539480007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02615686752393104,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02615686752393104
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233497,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233497
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097654,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097654
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608318,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608318
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3687150837988827,
"acc_stderr": 0.016135759015030122,
"acc_norm": 0.3687150837988827,
"acc_norm_stderr": 0.016135759015030122
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45697522816166886,
"acc_stderr": 0.012722869501611419,
"acc_norm": 0.45697522816166886,
"acc_norm_stderr": 0.012722869501611419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.01920660684882536,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.01920660684882536
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417475,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.02411267824090083,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.02411267824090083
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172012,
"mc2": 0.5337238959396524,
"mc2_stderr": 0.014980829261717704
},
"harness|winogrande|5": {
"acc": 0.8429360694554064,
"acc_stderr": 0.010226303949598475
},
"harness|gsm8k|5": {
"acc": 0.6019711902956786,
"acc_stderr": 0.013483026939074822
}
}
```
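As an illustration of how these figures can be post-processed, here is a hypothetical helper that averages the normalized accuracy over the hendrycksTest (MMLU) subtasks of a results dict shaped like the block above:

```python
def mean_mmlu_acc_norm(results: dict) -> float:
    """Average ``acc_norm`` over the hendrycksTest (MMLU) subtasks.

    ``results`` is assumed to have the same shape as the JSON block above,
    i.e. task names such as "harness|hendrycksTest-astronomy|5" mapping to
    dicts that contain an "acc_norm" entry.
    """
    scores = [
        task["acc_norm"]
        for name, task in results.items()
        if name.startswith("harness|hendrycksTest-")
    ]
    return sum(scores) / len(scores)
```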
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Xenon1__Eclipse-7B | [
"region:us"
] | 2024-02-15T01:58:39+00:00 | {"pretty_name": "Evaluation run of Xenon1/Eclipse-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Xenon1/Eclipse-7B](https://huggingface.co/Xenon1/Eclipse-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Xenon1__Eclipse-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T01:56:20.654560](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Eclipse-7B/blob/main/results_2024-02-15T01-56-20.654560.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6504871112025731,\n \"acc_stderr\": 0.032106673784384795,\n \"acc_norm\": 0.6520453433995954,\n \"acc_norm_stderr\": 0.032770034329884165,\n \"mc1\": 0.36474908200734396,\n \"mc1_stderr\": 0.01685096106172012,\n \"mc2\": 0.5337238959396524,\n \"mc2_stderr\": 0.014980829261717704\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.014356399418009123,\n \"acc_norm\": 0.6254266211604096,\n \"acc_norm_stderr\": 0.014144193471893458\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6384186417048396,\n \"acc_stderr\": 0.00479476484368527,\n \"acc_norm\": 0.8418641704839673,\n \"acc_norm_stderr\": 0.0036412262941678\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438655,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438655\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n 
\"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396264,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396264\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n \"acc_stderr\": 0.024251071262208837,\n \"acc_norm\": 0.7612903225806451,\n \"acc_norm_stderr\": 0.024251071262208837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n 
\"acc_stderr\": 0.024035489676335082,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335082\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.02925290592725197,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.02925290592725197\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7226890756302521,\n \"acc_stderr\": 0.029079374539480007,\n \"acc_norm\": 0.7226890756302521,\n \"acc_norm_stderr\": 0.029079374539480007\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.02615686752393104,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02615686752393104\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233497,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233497\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097654,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097654\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n \"acc_stderr\": 0.013428186370608318,\n \"acc_norm\": 0.8301404853128991,\n \"acc_norm_stderr\": 
0.013428186370608318\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3687150837988827,\n \"acc_stderr\": 0.016135759015030122,\n \"acc_norm\": 0.3687150837988827,\n \"acc_norm_stderr\": 0.016135759015030122\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45697522816166886,\n \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.45697522816166886,\n \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.01920660684882536,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.01920660684882536\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417475,\n \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417475\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n \"acc_stderr\": 0.02411267824090083,\n \"acc_norm\": 0.8656716417910447,\n \"acc_norm_stderr\": 0.02411267824090083\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n \"mc1_stderr\": 0.01685096106172012,\n \"mc2\": 0.5337238959396524,\n \"mc2_stderr\": 0.014980829261717704\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8429360694554064,\n \"acc_stderr\": 0.010226303949598475\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6019711902956786,\n \"acc_stderr\": 0.013483026939074822\n }\n}\n```", "repo_url": "https://huggingface.co/Xenon1/Eclipse-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|arc:challenge|25_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|gsm8k|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hellaswag|10_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T01-56-20.654560.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T01-56-20.654560.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T01-56-20.654560.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T01-56-20.654560.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T01-56-20.654560.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T01-56-20.654560.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["**/details_harness|winogrande|5_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T01-56-20.654560.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_15T01_56_20.654560", "path": ["results_2024-02-15T01-56-20.654560.parquet"]}, {"split": "latest", "path": 
["results_2024-02-15T01-56-20.654560.parquet"]}]}]} | 2024-02-15T01:59:03+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Xenon1/Eclipse-7B
Dataset automatically created during the evaluation run of model Xenon1/Eclipse-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-15T01:56:20.654560 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Xenon1/Eclipse-7B\n\n\n\nDataset automatically created during the evaluation run of model Xenon1/Eclipse-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T01:56:20.654560(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Xenon1/Eclipse-7B\n\n\n\nDataset automatically created during the evaluation run of model Xenon1/Eclipse-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T01:56:20.654560(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
0a997d530a1e0d2ab5535526c9dbf384e92deb27 | # Hello Datasets
This is the dataset used to fine tune [fine-tuned-gpt2](https://huggingface.co/nobodyiam/fine-tuned-gpt2). | nobodyiam/gpt2-dataset | [
"task_categories:question-answering",
"size_categories:n<1K",
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-02-15T02:03:33+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["n<1K"], "task_categories": ["question-answering"]} | 2024-02-16T13:08:50+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #size_categories-n<1K #language-English #license-apache-2.0 #region-us
| # Hello Datasets
This is the dataset used to fine tune fine-tuned-gpt2. | [
"# Hello Datasets\n\nThis is the dataset used to fine tune fine-tuned-gpt2."
] | [
"TAGS\n#task_categories-question-answering #size_categories-n<1K #language-English #license-apache-2.0 #region-us \n",
"# Hello Datasets\n\nThis is the dataset used to fine tune fine-tuned-gpt2."
] |
6eda3fbc97222bfb189f0f497fe7db16a9c072f0 | # Dataset Card for "VN-SFT-New"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | chiennv/VN-SFT-New | [
"region:us"
] | 2024-02-15T02:25:14+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "conversations", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 15891894, "num_examples": 2627}], "download_size": 6793379, "dataset_size": 15891894}} | 2024-02-15T06:03:40+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "VN-SFT-New"
More Information needed | [
"# Dataset Card for \"VN-SFT-New\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"VN-SFT-New\"\n\nMore Information needed"
] |
7dfe45419a6d7c02eaed756c5d8bd64270b42e35 |
# Dataset Card for Evaluation run of tyson0420/stack_llama_full
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [tyson0420/stack_llama_full](https://huggingface.co/tyson0420/stack_llama_full) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tyson0420__stack_llama_full",
"harness_winogrande_5",
split="train")
```
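Beyond the per-task configurations, the aggregated metrics can be loaded from the "results" configuration described above. The snippet below is a minimal sketch (not part of the original card), assuming the `results` configuration and the `latest` split listed in this repository's configuration:

```python
from datasets import load_dataset

# Minimal sketch: load the aggregated "results" configuration for this run.
# The "latest" split name follows the configuration listing of this repository.
results = load_dataset(
    "open-llm-leaderboard/details_tyson0420__stack_llama_full",
    "results",
    split="latest",
)

# Inspect the first row of aggregated metrics.
print(results[0])
```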
## Latest results
These are the [latest results from run 2024-02-15T02:39:31.431617](https://huggingface.co/datasets/open-llm-leaderboard/details_tyson0420__stack_llama_full/blob/main/results_2024-02-15T02-39-31.431617.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.45771825508878755,
"acc_stderr": 0.03439123063540327,
"acc_norm": 0.46263777295003417,
"acc_norm_stderr": 0.035181589104020056,
"mc1": 0.2582619339045288,
"mc1_stderr": 0.0153218216884762,
"mc2": 0.4026244833689869,
"mc2_stderr": 0.013830293181973206
},
"harness|arc:challenge|25": {
"acc": 0.5127986348122867,
"acc_stderr": 0.014606603181012534,
"acc_norm": 0.5426621160409556,
"acc_norm_stderr": 0.01455810654392407
},
"harness|hellaswag|10": {
"acc": 0.5903206532563234,
"acc_stderr": 0.004907694727935687,
"acc_norm": 0.7875921131248755,
"acc_norm_stderr": 0.0040817604652901825
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.04256193767901407,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.04256193767901407
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40789473684210525,
"acc_stderr": 0.039993097127774706,
"acc_norm": 0.40789473684210525,
"acc_norm_stderr": 0.039993097127774706
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4490566037735849,
"acc_stderr": 0.030612730713641095,
"acc_norm": 0.4490566037735849,
"acc_norm_stderr": 0.030612730713641095
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04155319955593146,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04155319955593146
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113946,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113946
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490986,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490986
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4612903225806452,
"acc_stderr": 0.028358634859836925,
"acc_norm": 0.4612903225806452,
"acc_norm_stderr": 0.028358634859836925
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.593939393939394,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.593939393939394,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4696969696969697,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.4696969696969697,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6839378238341969,
"acc_stderr": 0.033553973696861736,
"acc_norm": 0.6839378238341969,
"acc_norm_stderr": 0.033553973696861736
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.43333333333333335,
"acc_stderr": 0.025124653525885124,
"acc_norm": 0.43333333333333335,
"acc_norm_stderr": 0.025124653525885124
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712166,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712166
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40336134453781514,
"acc_stderr": 0.03186608121408831,
"acc_norm": 0.40336134453781514,
"acc_norm_stderr": 0.03186608121408831
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.035433042343899844,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.035433042343899844
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.581651376146789,
"acc_stderr": 0.021149548596443885,
"acc_norm": 0.581651376146789,
"acc_norm_stderr": 0.021149548596443885
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.02769691071309394,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.02769691071309394
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03503235296367993,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03503235296367993
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5864978902953587,
"acc_stderr": 0.03205649904851859,
"acc_norm": 0.5864978902953587,
"acc_norm_stderr": 0.03205649904851859
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5112107623318386,
"acc_stderr": 0.033549366530984746,
"acc_norm": 0.5112107623318386,
"acc_norm_stderr": 0.033549366530984746
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.549618320610687,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.549618320610687,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5950413223140496,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.5950413223140496,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49693251533742333,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.49693251533742333,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.5631067961165048,
"acc_stderr": 0.049111471073657764,
"acc_norm": 0.5631067961165048,
"acc_norm_stderr": 0.049111471073657764
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7136752136752137,
"acc_stderr": 0.02961432369045665,
"acc_norm": 0.7136752136752137,
"acc_norm_stderr": 0.02961432369045665
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.611749680715198,
"acc_stderr": 0.017427673295544333,
"acc_norm": 0.611749680715198,
"acc_norm_stderr": 0.017427673295544333
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.026907849856282542,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.026907849856282542
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.329608938547486,
"acc_stderr": 0.01572153107518387,
"acc_norm": 0.329608938547486,
"acc_norm_stderr": 0.01572153107518387
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.028607893699576073,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.028607893699576073
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6109324758842444,
"acc_stderr": 0.027690337536485376,
"acc_norm": 0.6109324758842444,
"acc_norm_stderr": 0.027690337536485376
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4783950617283951,
"acc_stderr": 0.02779476010500874,
"acc_norm": 0.4783950617283951,
"acc_norm_stderr": 0.02779476010500874
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3475177304964539,
"acc_stderr": 0.028406627809590954,
"acc_norm": 0.3475177304964539,
"acc_norm_stderr": 0.028406627809590954
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.36310299869621904,
"acc_stderr": 0.012282264406018753,
"acc_norm": 0.36310299869621904,
"acc_norm_stderr": 0.012282264406018753
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5110294117647058,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.5110294117647058,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4477124183006536,
"acc_stderr": 0.02011692534742242,
"acc_norm": 0.4477124183006536,
"acc_norm_stderr": 0.02011692534742242
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972744,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972744
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4816326530612245,
"acc_stderr": 0.031987615467631264,
"acc_norm": 0.4816326530612245,
"acc_norm_stderr": 0.031987615467631264
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6069651741293532,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.6069651741293532,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.0352821125824523,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.0352821125824523
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2582619339045288,
"mc1_stderr": 0.0153218216884762,
"mc2": 0.4026244833689869,
"mc2_stderr": 0.013830293181973206
},
"harness|winogrande|5": {
"acc": 0.7348066298342542,
"acc_stderr": 0.01240654946619286
},
"harness|gsm8k|5": {
"acc": 0.11751326762699014,
"acc_stderr": 0.008870331256489962
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_tyson0420__stack_llama_full | [
"region:us"
] | 2024-02-15T02:41:55+00:00 | {"pretty_name": "Evaluation run of tyson0420/stack_llama_full", "dataset_summary": "Dataset automatically created during the evaluation run of model [tyson0420/stack_llama_full](https://huggingface.co/tyson0420/stack_llama_full) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tyson0420__stack_llama_full\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T02:39:31.431617](https://huggingface.co/datasets/open-llm-leaderboard/details_tyson0420__stack_llama_full/blob/main/results_2024-02-15T02-39-31.431617.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.45771825508878755,\n \"acc_stderr\": 0.03439123063540327,\n \"acc_norm\": 0.46263777295003417,\n \"acc_norm_stderr\": 0.035181589104020056,\n \"mc1\": 0.2582619339045288,\n \"mc1_stderr\": 0.0153218216884762,\n \"mc2\": 0.4026244833689869,\n \"mc2_stderr\": 0.013830293181973206\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5127986348122867,\n \"acc_stderr\": 0.014606603181012534,\n \"acc_norm\": 0.5426621160409556,\n \"acc_norm_stderr\": 0.01455810654392407\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5903206532563234,\n \"acc_stderr\": 0.004907694727935687,\n \"acc_norm\": 0.7875921131248755,\n \"acc_norm_stderr\": 0.0040817604652901825\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n \"acc_stderr\": 0.04256193767901407,\n \"acc_norm\": 0.4148148148148148,\n \"acc_norm_stderr\": 0.04256193767901407\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.039993097127774706,\n \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.039993097127774706\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4490566037735849,\n \"acc_stderr\": 0.030612730713641095,\n \"acc_norm\": 0.4490566037735849,\n \"acc_norm_stderr\": 0.030612730713641095\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04155319955593146,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04155319955593146\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 
0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.42196531791907516,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113946,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113946\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.043435254289490986,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.043435254289490986\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4612903225806452,\n \"acc_stderr\": 0.028358634859836925,\n \"acc_norm\": 0.4612903225806452,\n \"acc_norm_stderr\": 0.028358634859836925\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.593939393939394,\n \"acc_stderr\": 0.03834816355401181,\n \"acc_norm\": 0.593939393939394,\n \"acc_norm_stderr\": 0.03834816355401181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4696969696969697,\n \"acc_stderr\": 0.03555804051763929,\n \"acc_norm\": 0.4696969696969697,\n \"acc_norm_stderr\": 0.03555804051763929\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6839378238341969,\n \"acc_stderr\": 0.033553973696861736,\n \"acc_norm\": 0.6839378238341969,\n \"acc_norm_stderr\": 0.033553973696861736\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.43333333333333335,\n \"acc_stderr\": 0.025124653525885124,\n \"acc_norm\": 0.43333333333333335,\n \"acc_norm_stderr\": 0.025124653525885124\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712166,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712166\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.40336134453781514,\n \"acc_stderr\": 0.03186608121408831,\n \"acc_norm\": 0.40336134453781514,\n \"acc_norm_stderr\": 0.03186608121408831\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.581651376146789,\n \"acc_stderr\": 0.021149548596443885,\n \"acc_norm\": 0.581651376146789,\n \"acc_norm_stderr\": 0.021149548596443885\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.20833333333333334,\n \"acc_stderr\": 0.02769691071309394,\n \"acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.02769691071309394\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03503235296367993,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03503235296367993\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5864978902953587,\n \"acc_stderr\": 0.03205649904851859,\n \"acc_norm\": 0.5864978902953587,\n \"acc_norm_stderr\": 0.03205649904851859\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5112107623318386,\n \"acc_stderr\": 0.033549366530984746,\n \"acc_norm\": 0.5112107623318386,\n \"acc_norm_stderr\": 0.033549366530984746\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.549618320610687,\n \"acc_stderr\": 0.04363643698524779,\n \"acc_norm\": 0.549618320610687,\n \"acc_norm_stderr\": 0.04363643698524779\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5950413223140496,\n \"acc_stderr\": 0.04481137755942469,\n \"acc_norm\": 0.5950413223140496,\n \"acc_norm_stderr\": 0.04481137755942469\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.49693251533742333,\n \"acc_stderr\": 0.03928297078179663,\n \"acc_norm\": 0.49693251533742333,\n \"acc_norm_stderr\": 0.03928297078179663\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.049111471073657764,\n \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.049111471073657764\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7136752136752137,\n \"acc_stderr\": 0.02961432369045665,\n \"acc_norm\": 0.7136752136752137,\n \"acc_norm_stderr\": 0.02961432369045665\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.611749680715198,\n \"acc_stderr\": 0.017427673295544333,\n \"acc_norm\": 0.611749680715198,\n \"acc_norm_stderr\": 0.017427673295544333\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.48554913294797686,\n \"acc_stderr\": 0.026907849856282542,\n \"acc_norm\": 0.48554913294797686,\n \"acc_norm_stderr\": 0.026907849856282542\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.329608938547486,\n \"acc_stderr\": 0.01572153107518387,\n \"acc_norm\": 0.329608938547486,\n \"acc_norm_stderr\": 0.01572153107518387\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.028607893699576073,\n \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.028607893699576073\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6109324758842444,\n \"acc_stderr\": 0.027690337536485376,\n \"acc_norm\": 0.6109324758842444,\n \"acc_norm_stderr\": 0.027690337536485376\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4783950617283951,\n \"acc_stderr\": 0.02779476010500874,\n \"acc_norm\": 0.4783950617283951,\n \"acc_norm_stderr\": 0.02779476010500874\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3475177304964539,\n \"acc_stderr\": 0.028406627809590954,\n \"acc_norm\": 0.3475177304964539,\n \"acc_norm_stderr\": 0.028406627809590954\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36310299869621904,\n \"acc_stderr\": 0.012282264406018753,\n \"acc_norm\": 0.36310299869621904,\n \"acc_norm_stderr\": 0.012282264406018753\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5110294117647058,\n \"acc_stderr\": 0.030365446477275675,\n \"acc_norm\": 0.5110294117647058,\n \"acc_norm_stderr\": 0.030365446477275675\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4477124183006536,\n \"acc_stderr\": 0.02011692534742242,\n \"acc_norm\": 0.4477124183006536,\n \"acc_norm_stderr\": 0.02011692534742242\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.04769300568972744,\n \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.04769300568972744\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4816326530612245,\n \"acc_stderr\": 0.031987615467631264,\n \"acc_norm\": 0.4816326530612245,\n \"acc_norm_stderr\": 0.031987615467631264\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6069651741293532,\n \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.6069651741293532,\n \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2582619339045288,\n \"mc1_stderr\": 0.0153218216884762,\n \"mc2\": 0.4026244833689869,\n \"mc2_stderr\": 0.013830293181973206\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7348066298342542,\n \"acc_stderr\": 0.01240654946619286\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11751326762699014,\n \"acc_stderr\": 
0.008870331256489962\n }\n}\n```", "repo_url": "https://huggingface.co/tyson0420/stack_llama_full", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|arc:challenge|25_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|gsm8k|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hellaswag|10_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T02-39-31.431617.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T02-39-31.431617.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T02-39-31.431617.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T02-39-31.431617.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T02-39-31.431617.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T02_39_31.431617", "path": ["**/details_harness|winogrande|5_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T02-39-31.431617.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_15T02_39_31.431617", "path": ["results_2024-02-15T02-39-31.431617.parquet"]}, {"split": "latest", "path": ["results_2024-02-15T02-39-31.431617.parquet"]}]}]} | 2024-02-15T02:42:17+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of tyson0420/stack_llama_full
Dataset automatically created during the evaluation run of model tyson0420/stack_llama_full on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
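A minimal loading sketch, reconstructed from this card's metadata; the repository name and the example config ("harness_winogrande_5") are the ones recorded for this run.
```python
from datasets import load_dataset

# Per-sample details for one task configuration of this run.
# The "train" split always points to the latest results.
data = load_dataset("open-llm-leaderboard/details_tyson0420__stack_llama_full",
	"harness_winogrande_5",
	split="train")
```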
## Latest results
These are the latest results from run 2024-02-15T02:39:31.431617 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
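For quick reference, the aggregate ("all") block of that results file, as recorded in this card's metadata, is excerpted below; per-task entries follow the same structure in the full JSON.
```python
{
    "all": {
        "acc": 0.45771825508878755,
        "acc_stderr": 0.03439123063540327,
        "acc_norm": 0.46263777295003417,
        "acc_norm_stderr": 0.035181589104020056,
        "mc1": 0.2582619339045288,
        "mc1_stderr": 0.0153218216884762,
        "mc2": 0.4026244833689869,
        "mc2_stderr": 0.013830293181973206
    }
}
```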
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of tyson0420/stack_llama_full\n\n\n\nDataset automatically created during the evaluation run of model tyson0420/stack_llama_full on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T02:39:31.431617(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of tyson0420/stack_llama_full\n\n\n\nDataset automatically created during the evaluation run of model tyson0420/stack_llama_full on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T02:39:31.431617(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
3e79648454a8fff4eb7bf38d8c6f3d191af69d53 |
# Dataset Card for Evaluation run of tyson0420/stack_llama-clang
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [tyson0420/stack_llama-clang](https://huggingface.co/tyson0420/stack_llama-clang) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tyson0420__stack_llama-clang",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-15T02:56:08.832354](https://huggingface.co/datasets/open-llm-leaderboard/details_tyson0420__stack_llama-clang/blob/main/results_2024-02-15T02-56-08.832354.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4618131836395572,
"acc_stderr": 0.034500369701758626,
"acc_norm": 0.46667953089342645,
"acc_norm_stderr": 0.035292641447203094,
"mc1": 0.24724602203182375,
"mc1_stderr": 0.01510240479735965,
"mc2": 0.38648505988264525,
"mc2_stderr": 0.013533394304850859
},
"harness|arc:challenge|25": {
"acc": 0.5102389078498294,
"acc_stderr": 0.014608326906285012,
"acc_norm": 0.5409556313993175,
"acc_norm_stderr": 0.014562291073601229
},
"harness|hellaswag|10": {
"acc": 0.5918143796056562,
"acc_stderr": 0.004904933500255878,
"acc_norm": 0.7892850029874527,
"acc_norm_stderr": 0.004069829028416317
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.43018867924528303,
"acc_stderr": 0.030471445867183235,
"acc_norm": 0.43018867924528303,
"acc_norm_stderr": 0.030471445867183235
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04155319955593146,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04155319955593146
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3930635838150289,
"acc_stderr": 0.03724249595817729,
"acc_norm": 0.3930635838150289,
"acc_norm_stderr": 0.03724249595817729
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.032232762667117124,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.032232762667117124
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2830687830687831,
"acc_stderr": 0.023201392938194974,
"acc_norm": 0.2830687830687831,
"acc_norm_stderr": 0.023201392938194974
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5032258064516129,
"acc_stderr": 0.028443414226438316,
"acc_norm": 0.5032258064516129,
"acc_norm_stderr": 0.028443414226438316
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.032550867699701024,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.032550867699701024
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6121212121212121,
"acc_stderr": 0.038049136539710114,
"acc_norm": 0.6121212121212121,
"acc_norm_stderr": 0.038049136539710114
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5,
"acc_stderr": 0.035623524993954825,
"acc_norm": 0.5,
"acc_norm_stderr": 0.035623524993954825
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6269430051813472,
"acc_stderr": 0.03490205592048574,
"acc_norm": 0.6269430051813472,
"acc_norm_stderr": 0.03490205592048574
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4512820512820513,
"acc_stderr": 0.025230381238934833,
"acc_norm": 0.4512820512820513,
"acc_norm_stderr": 0.025230381238934833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945284,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945284
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.39915966386554624,
"acc_stderr": 0.031811100324139245,
"acc_norm": 0.39915966386554624,
"acc_norm_stderr": 0.031811100324139245
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6220183486238532,
"acc_stderr": 0.02078918706672811,
"acc_norm": 0.6220183486238532,
"acc_norm_stderr": 0.02078918706672811
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.33796296296296297,
"acc_stderr": 0.032259413526312945,
"acc_norm": 0.33796296296296297,
"acc_norm_stderr": 0.032259413526312945
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.03492406104163613,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.03492406104163613
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.620253164556962,
"acc_stderr": 0.031591887529658504,
"acc_norm": 0.620253164556962,
"acc_norm_stderr": 0.031591887529658504
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.547085201793722,
"acc_stderr": 0.03340867501923324,
"acc_norm": 0.547085201793722,
"acc_norm_stderr": 0.03340867501923324
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.043564472026650695,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.043564472026650695
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49079754601226994,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.49079754601226994,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.5436893203883495,
"acc_stderr": 0.04931801994220416,
"acc_norm": 0.5436893203883495,
"acc_norm_stderr": 0.04931801994220416
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7136752136752137,
"acc_stderr": 0.02961432369045665,
"acc_norm": 0.7136752136752137,
"acc_norm_stderr": 0.02961432369045665
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6143039591315453,
"acc_stderr": 0.017406476619212907,
"acc_norm": 0.6143039591315453,
"acc_norm_stderr": 0.017406476619212907
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.026918645383239015,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.026918645383239015
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.01428834380392529,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.01428834380392529
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5,
"acc_stderr": 0.028629916715693413,
"acc_norm": 0.5,
"acc_norm_stderr": 0.028629916715693413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5755627009646302,
"acc_stderr": 0.028071928247946208,
"acc_norm": 0.5755627009646302,
"acc_norm_stderr": 0.028071928247946208
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5030864197530864,
"acc_stderr": 0.02782021415859437,
"acc_norm": 0.5030864197530864,
"acc_norm_stderr": 0.02782021415859437
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.32978723404255317,
"acc_stderr": 0.0280459469420424,
"acc_norm": 0.32978723404255317,
"acc_norm_stderr": 0.0280459469420424
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35984354628422427,
"acc_stderr": 0.012258260483689797,
"acc_norm": 0.35984354628422427,
"acc_norm_stderr": 0.012258260483689797
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.020087362076702857,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.020087362076702857
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794915,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794915
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.46122448979591835,
"acc_stderr": 0.031912820526692774,
"acc_norm": 0.46122448979591835,
"acc_norm_stderr": 0.031912820526692774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6368159203980099,
"acc_stderr": 0.03400598505599014,
"acc_norm": 0.6368159203980099,
"acc_norm_stderr": 0.03400598505599014
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.0352821125824523,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.0352821125824523
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24724602203182375,
"mc1_stderr": 0.01510240479735965,
"mc2": 0.38648505988264525,
"mc2_stderr": 0.013533394304850859
},
"harness|winogrande|5": {
"acc": 0.7411207576953434,
"acc_stderr": 0.012310515810993376
},
"harness|gsm8k|5": {
"acc": 0.12357846853677028,
"acc_stderr": 0.009065050306776913
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_tyson0420__stack_llama-clang | [
"region:us"
] | 2024-02-15T02:58:36+00:00 | {"pretty_name": "Evaluation run of tyson0420/stack_llama-clang", "dataset_summary": "Dataset automatically created during the evaluation run of model [tyson0420/stack_llama-clang](https://huggingface.co/tyson0420/stack_llama-clang) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tyson0420__stack_llama-clang\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T02:56:08.832354](https://huggingface.co/datasets/open-llm-leaderboard/details_tyson0420__stack_llama-clang/blob/main/results_2024-02-15T02-56-08.832354.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4618131836395572,\n \"acc_stderr\": 0.034500369701758626,\n \"acc_norm\": 0.46667953089342645,\n \"acc_norm_stderr\": 0.035292641447203094,\n \"mc1\": 0.24724602203182375,\n \"mc1_stderr\": 0.01510240479735965,\n \"mc2\": 0.38648505988264525,\n \"mc2_stderr\": 0.013533394304850859\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5102389078498294,\n \"acc_stderr\": 0.014608326906285012,\n \"acc_norm\": 0.5409556313993175,\n \"acc_norm_stderr\": 0.014562291073601229\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5918143796056562,\n \"acc_stderr\": 0.004904933500255878,\n \"acc_norm\": 0.7892850029874527,\n \"acc_norm_stderr\": 0.004069829028416317\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.04008973785779206,\n \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.04008973785779206\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.43018867924528303,\n \"acc_stderr\": 0.030471445867183235,\n \"acc_norm\": 0.43018867924528303,\n \"acc_norm_stderr\": 0.030471445867183235\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04155319955593146,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04155319955593146\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 
0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3930635838150289,\n \"acc_stderr\": 0.03724249595817729,\n \"acc_norm\": 0.3930635838150289,\n \"acc_norm_stderr\": 0.03724249595817729\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.032232762667117124,\n \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.032232762667117124\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2830687830687831,\n \"acc_stderr\": 0.023201392938194974,\n \"acc_norm\": 0.2830687830687831,\n \"acc_norm_stderr\": 0.023201392938194974\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5032258064516129,\n \"acc_stderr\": 0.028443414226438316,\n \"acc_norm\": 0.5032258064516129,\n \"acc_norm_stderr\": 0.028443414226438316\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.032550867699701024,\n \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.032550867699701024\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6121212121212121,\n \"acc_stderr\": 0.038049136539710114,\n \"acc_norm\": 0.6121212121212121,\n \"acc_norm_stderr\": 0.038049136539710114\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.035623524993954825,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.035623524993954825\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6269430051813472,\n \"acc_stderr\": 0.03490205592048574,\n \"acc_norm\": 0.6269430051813472,\n \"acc_norm_stderr\": 0.03490205592048574\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.4512820512820513,\n \"acc_stderr\": 0.025230381238934833,\n \"acc_norm\": 0.4512820512820513,\n \"acc_norm_stderr\": 0.025230381238934833\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945284,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945284\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.39915966386554624,\n \"acc_stderr\": 0.031811100324139245,\n \"acc_norm\": 0.39915966386554624,\n \"acc_norm_stderr\": 0.031811100324139245\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6220183486238532,\n \"acc_stderr\": 0.02078918706672811,\n \"acc_norm\": 0.6220183486238532,\n \"acc_norm_stderr\": 0.02078918706672811\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.33796296296296297,\n \"acc_stderr\": 0.032259413526312945,\n \"acc_norm\": 0.33796296296296297,\n \"acc_norm_stderr\": 0.032259413526312945\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.03492406104163613,\n \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.03492406104163613\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.620253164556962,\n \"acc_stderr\": 0.031591887529658504,\n \"acc_norm\": 0.620253164556962,\n \"acc_norm_stderr\": 0.031591887529658504\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.547085201793722,\n \"acc_stderr\": 0.03340867501923324,\n \"acc_norm\": 0.547085201793722,\n \"acc_norm_stderr\": 0.03340867501923324\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.043564472026650695,\n \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.043564472026650695\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968432,\n \"acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968432\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.49079754601226994,\n \"acc_stderr\": 0.03927705600787443,\n \"acc_norm\": 0.49079754601226994,\n \"acc_norm_stderr\": 0.03927705600787443\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5436893203883495,\n \"acc_stderr\": 0.04931801994220416,\n \"acc_norm\": 0.5436893203883495,\n \"acc_norm_stderr\": 0.04931801994220416\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7136752136752137,\n \"acc_stderr\": 0.02961432369045665,\n \"acc_norm\": 0.7136752136752137,\n \"acc_norm_stderr\": 0.02961432369045665\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6143039591315453,\n \"acc_stderr\": 0.017406476619212907,\n 
\"acc_norm\": 0.6143039591315453,\n \"acc_norm_stderr\": 0.017406476619212907\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.49710982658959535,\n \"acc_stderr\": 0.026918645383239015,\n \"acc_norm\": 0.49710982658959535,\n \"acc_norm_stderr\": 0.026918645383239015\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n \"acc_stderr\": 0.01428834380392529,\n \"acc_norm\": 0.24022346368715083,\n \"acc_norm_stderr\": 0.01428834380392529\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.028629916715693413,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.028629916715693413\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5755627009646302,\n \"acc_stderr\": 0.028071928247946208,\n \"acc_norm\": 0.5755627009646302,\n \"acc_norm_stderr\": 0.028071928247946208\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5030864197530864,\n \"acc_stderr\": 0.02782021415859437,\n \"acc_norm\": 0.5030864197530864,\n \"acc_norm_stderr\": 0.02782021415859437\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.32978723404255317,\n \"acc_stderr\": 0.0280459469420424,\n \"acc_norm\": 0.32978723404255317,\n \"acc_norm_stderr\": 0.0280459469420424\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35984354628422427,\n \"acc_stderr\": 0.012258260483689797,\n \"acc_norm\": 0.35984354628422427,\n \"acc_norm_stderr\": 0.012258260483689797\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.020087362076702857,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.020087362076702857\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n \"acc_stderr\": 0.04785964010794915,\n \"acc_norm\": 0.5181818181818182,\n \"acc_norm_stderr\": 0.04785964010794915\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.46122448979591835,\n \"acc_stderr\": 0.031912820526692774,\n \"acc_norm\": 0.46122448979591835,\n \"acc_norm_stderr\": 0.031912820526692774\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6368159203980099,\n \"acc_stderr\": 0.03400598505599014,\n \"acc_norm\": 0.6368159203980099,\n \"acc_norm_stderr\": 0.03400598505599014\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24724602203182375,\n \"mc1_stderr\": 0.01510240479735965,\n \"mc2\": 0.38648505988264525,\n \"mc2_stderr\": 0.013533394304850859\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7411207576953434,\n \"acc_stderr\": 0.012310515810993376\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12357846853677028,\n \"acc_stderr\": 0.009065050306776913\n }\n}\n```", "repo_url": "https://huggingface.co/tyson0420/stack_llama-clang", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|arc:challenge|25_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|gsm8k|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hellaswag|10_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T02-56-08.832354.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T02-56-08.832354.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T02-56-08.832354.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T02-56-08.832354.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T02-56-08.832354.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T02-56-08.832354.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["**/details_harness|winogrande|5_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T02-56-08.832354.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_15T02_56_08.832354", "path": ["results_2024-02-15T02-56-08.832354.parquet"]}, {"split": "latest", "path": 
["results_2024-02-15T02-56-08.832354.parquet"]}]}]} | 2024-02-15T02:59:00+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of tyson0420/stack_llama-clang
Dataset automatically created during the evaluation run of model tyson0420/stack_llama-clang on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
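For instance, with the Hugging Face `datasets` library (the repository and configuration names below are taken from this card's configuration metadata; any other listed configuration, e.g. `harness_gsm8k_5`, can be substituted):

```python
from datasets import load_dataset

# Per-sample details for one evaluated task of this run; any configuration
# listed in the metadata above (e.g. "harness_gsm8k_5") can be used instead.
data = load_dataset(
    "open-llm-leaderboard/details_tyson0420__stack_llama-clang",
    "harness_winogrande_5",
    split="train",  # the "train" split always points to the latest results
)
```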
## Latest results
These are the latest results from run 2024-02-15T02:56:08.832354 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of tyson0420/stack_llama-clang\n\n\n\nDataset automatically created during the evaluation run of model tyson0420/stack_llama-clang on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T02:56:08.832354(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of tyson0420/stack_llama-clang\n\n\n\nDataset automatically created during the evaluation run of model tyson0420/stack_llama-clang on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T02:56:08.832354(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
85497b9031af71f4183200e08d8ed50314e217e5 | # A corpus of rewritten pubmed abstracts
This corpus contains a 1k example subset from the [pubmed](https://huggingface.co/datasets/pubmed) corpus and various rewritten versions. The rewritten versions change one aspect of the original text and keep other aspects unchanged as much as possible.
- **Paper:** [Dissecting learning and forgetting in language model finetuning](https://openreview.net/forum?id=tmsqb6WpLz)
Another corpus of rewritten general text is provided here: [c4_derived](https://huggingface.co/datasets/xiaozeroone/c4_derived)
### Data Splits
- pubmed: a 1k example subset from the original pubmed corpus
- nonbiomedical: main topic of text changed to nonbiomedical topic
- counterfactual: factual knowledge in the text replaced by incorrect facts
- casual: style of text changed to a casual style
- rap: style of text changed to a rap style
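A single split can be loaded with the `datasets` library, for instance (a sketch using the split names listed above):

```python
from datasets import load_dataset

# load only the counterfactual rewrites of the pubmed subset
counterfactual = load_dataset("xiaozeroone/pubmed_derived", split="counterfactual")
print(counterfactual[0]["text"])  # rewritten abstract text
```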
## Dataset Creation
Text is generated by ChatGPT with corresponding prompts. Refer to the paper for the instructions used to generate the text in each derived subset.
Please check the terms and conditions of pubmed data [here](https://www.nlm.nih.gov/databases/download/terms_and_conditions.html).
### Citation Information
```
@inproceedings{
zhang2024dissecting,
title={Dissecting learning and forgetting in language model finetuning},
author={Xiao Zhang and Ji Wu},
booktitle={The Twelfth International Conference on Learning Representations},
year={2024},
url={https://openreview.net/forum?id=tmsqb6WpLz}
}
``` | xiaozeroone/pubmed_derived | [
"language:en",
"region:us"
] | 2024-02-15T03:08:29+00:00 | {"language": ["en"], "configs": [{"config_name": "default", "data_files": [{"split": "pubmed", "path": "data/pubmed-*"}, {"split": "nonbiomedical", "path": "data/nonbiomedical-*"}, {"split": "counterfactual", "path": "data/counterfactual-*"}, {"split": "casual", "path": "data/casual-*"}, {"split": "rap", "path": "data/rap-*"}]}], "dataset_info": {"features": [{"name": "PubmedData", "struct": [{"name": "ArticleIdList", "sequence": [{"name": "ArticleId", "sequence": "string"}]}, {"name": "PublicationStatus", "dtype": "string"}, {"name": "History", "struct": [{"name": "PubMedPubDate", "sequence": [{"name": "Year", "dtype": "int32"}, {"name": "Month", "dtype": "int32"}, {"name": "Day", "dtype": "int32"}]}]}, {"name": "ReferenceList", "sequence": [{"name": "Citation", "dtype": "string"}, {"name": "CitationId", "dtype": "int32"}]}]}, {"name": "text", "dtype": "string"}], "splits": [{"name": "pubmed", "num_bytes": 1166668, "num_examples": 1000}, {"name": "nonbiomedical", "num_bytes": 1141909, "num_examples": 1000}, {"name": "counterfactual", "num_bytes": 1179347, "num_examples": 991}, {"name": "casual", "num_bytes": 1205949, "num_examples": 1000}, {"name": "rap", "num_bytes": 1252260, "num_examples": 1000}], "download_size": 3357032, "dataset_size": 5946133}} | 2024-02-15T03:30:02+00:00 | [] | [
"en"
] | TAGS
#language-English #region-us
| # A corpus of rewritten pubmed abstracts
This corpus contains a 1k example subset from the pubmed corpus and various rewritten versions. The rewritten versions change one aspect of the original text and keep other aspects unchanged as much as possible.
- Paper: Dissecting learning and forgetting in language model finetuning
Another corpus of rewritten general text is provided here: c4_derived
### Data Splits
- pubmed: a 1k example subset from the original pubmed corpus
- nonbiomedical: main topic of text changed to nonbiomedical topic
- counterfactual: factual knowledge in the text replaced by incorrect facts
- casual: style of text changed to a casual style
- rap: style of text changed to a rap style
## Dataset Creation
Text is generated by ChatGPT with corresponding prompts. Refer to the paper for the instructions used to generate the text in each derived subset.
Please check the terms and conditions of pubmed data here.
| [
"# A corpus of rewritten pubmed abstracts\n\nThis corpus contains a 1k example subset from the pubmed corpus and various rewritten versions. The rewritten versions change one aspect of the orginal text and keeps other aspects unchanged as much as possible.\n\n- Paper: Dissecting learning and forgetting in language model finetuning\n\nAnother corpus of rewritten general text is provided here: c4_derived",
"### Data Splits\n\n- pubmed: a 1k example subset from the original pubmed corpus\n\n- nonbiomedical: main topic of text changed to nonbiomedical topic\n\n- counerfactual: factuals knowledge in text replaced by incorrect factuals\n\n- casual: style of text changed to a casual style\n\n- rap: style of text changed to a rap style",
"## Dataset Creation\n\nText is generated by ChatGPT with corresponding prompts. Refer to the paper for the instructions used to generate text in each derived subsets.\n\nPlease check the terms and conditions of pubmed data here."
] | [
"TAGS\n#language-English #region-us \n",
"# A corpus of rewritten pubmed abstracts\n\nThis corpus contains a 1k example subset from the pubmed corpus and various rewritten versions. The rewritten versions change one aspect of the orginal text and keeps other aspects unchanged as much as possible.\n\n- Paper: Dissecting learning and forgetting in language model finetuning\n\nAnother corpus of rewritten general text is provided here: c4_derived",
"### Data Splits\n\n- pubmed: a 1k example subset from the original pubmed corpus\n\n- nonbiomedical: main topic of text changed to nonbiomedical topic\n\n- counerfactual: factuals knowledge in text replaced by incorrect factuals\n\n- casual: style of text changed to a casual style\n\n- rap: style of text changed to a rap style",
"## Dataset Creation\n\nText is generated by ChatGPT with corresponding prompts. Refer to the paper for the instructions used to generate text in each derived subsets.\n\nPlease check the terms and conditions of pubmed data here."
] |
2600a8eeeae6f0c5d5ce3cb49c50d2682c962c48 | # Dataset Card for "c4_derived"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | xiaozeroone/c4_derived | [
"region:us"
] | 2024-02-15T03:25:20+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "c4", "path": "data/c4-*"}, {"split": "biomedical", "path": "data/biomedical-*"}, {"split": "counterfactual", "path": "data/counterfactual-*"}, {"split": "academic", "path": "data/academic-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "url", "dtype": "string"}], "splits": [{"name": "c4", "num_bytes": 1820234, "num_examples": 1000}, {"name": "biomedical", "num_bytes": 1803036, "num_examples": 989}, {"name": "counterfactual", "num_bytes": 1813882, "num_examples": 985}, {"name": "academic", "num_bytes": 1199491, "num_examples": 986}], "download_size": 4124290, "dataset_size": 6636643}} | 2023-10-08T11:33:07+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "c4_derived"
More Information needed | [
"# Dataset Card for \"c4_derived\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"c4_derived\"\n\nMore Information needed"
] |
e7366ae71f3b418de6a6558ba5ed066c726b0608 |
# Dataset Card for Evaluation run of Yuma42/KangalKhan-Ruby-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Yuma42/KangalKhan-Ruby-7B](https://huggingface.co/Yuma42/KangalKhan-Ruby-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yuma42__KangalKhan-Ruby-7B",
"harness_winogrande_5",
split="train")
```
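The aggregated scores can be loaded in the same way through the "results" configuration (a sketch; the "latest" split is assumed to point to the most recent run, as for the other configurations):

```python
from datasets import load_dataset

# aggregated metrics of the most recent evaluation run for this model
results = load_dataset("open-llm-leaderboard/details_Yuma42__KangalKhan-Ruby-7B",
                       "results",
                       split="latest")
```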
## Latest results
These are the [latest results from run 2024-02-15T04:27:32.929090](https://huggingface.co/datasets/open-llm-leaderboard/details_Yuma42__KangalKhan-Ruby-7B/blob/main/results_2024-02-15T04-27-32.929090.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6347473013166217,
"acc_stderr": 0.032259927653459516,
"acc_norm": 0.6365178163764577,
"acc_norm_stderr": 0.03290211517224385,
"mc1": 0.38922888616891066,
"mc1_stderr": 0.01706855268069033,
"mc2": 0.5648879973341684,
"mc2_stderr": 0.01540236564556069
},
"harness|arc:challenge|25": {
"acc": 0.6220136518771331,
"acc_stderr": 0.014169664520303098,
"acc_norm": 0.6723549488054608,
"acc_norm_stderr": 0.013715847940719337
},
"harness|hellaswag|10": {
"acc": 0.6683927504481179,
"acc_stderr": 0.004698285350019216,
"acc_norm": 0.8522206731726748,
"acc_norm_stderr": 0.00354155826377909
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944433,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268552,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268552
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647074,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647074
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163227,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163227
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290913,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290913
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313728,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313728
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3307262569832402,
"acc_stderr": 0.01573502625896612,
"acc_norm": 0.3307262569832402,
"acc_norm_stderr": 0.01573502625896612
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.02505850331695814,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.02505850331695814
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47327249022164275,
"acc_stderr": 0.012751977967676008,
"acc_norm": 0.47327249022164275,
"acc_norm_stderr": 0.012751977967676008
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.027686913588013003,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.027686913588013003
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38922888616891066,
"mc1_stderr": 0.01706855268069033,
"mc2": 0.5648879973341684,
"mc2_stderr": 0.01540236564556069
},
"harness|winogrande|5": {
"acc": 0.7797947908445146,
"acc_stderr": 0.011646276755089693
},
"harness|gsm8k|5": {
"acc": 0.6194086429112965,
"acc_stderr": 0.013373971277729818
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Yuma42__KangalKhan-Ruby-7B | [
"region:us"
] | 2024-02-15T04:24:37+00:00 | {"pretty_name": "Evaluation run of Yuma42/KangalKhan-Ruby-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Yuma42/KangalKhan-Ruby-7B](https://huggingface.co/Yuma42/KangalKhan-Ruby-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yuma42__KangalKhan-Ruby-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T04:27:32.929090](https://huggingface.co/datasets/open-llm-leaderboard/details_Yuma42__KangalKhan-Ruby-7B/blob/main/results_2024-02-15T04-27-32.929090.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6347473013166217,\n \"acc_stderr\": 0.032259927653459516,\n \"acc_norm\": 0.6365178163764577,\n \"acc_norm_stderr\": 0.03290211517224385,\n \"mc1\": 0.38922888616891066,\n \"mc1_stderr\": 0.01706855268069033,\n \"mc2\": 0.5648879973341684,\n \"mc2_stderr\": 0.01540236564556069\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6220136518771331,\n \"acc_stderr\": 0.014169664520303098,\n \"acc_norm\": 0.6723549488054608,\n \"acc_norm_stderr\": 0.013715847940719337\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6683927504481179,\n \"acc_stderr\": 0.004698285350019216,\n \"acc_norm\": 0.8522206731726748,\n \"acc_norm_stderr\": 0.00354155826377909\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n 
\"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268552,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268552\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6102564102564103,\n 
\"acc_stderr\": 0.024726967886647074,\n \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647074\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163227,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163227\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313728,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313728\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 
0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3307262569832402,\n \"acc_stderr\": 0.01573502625896612,\n \"acc_norm\": 0.3307262569832402,\n \"acc_norm_stderr\": 0.01573502625896612\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695814,\n \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695814\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n \"acc_stderr\": 0.012751977967676008,\n \"acc_norm\": 0.47327249022164275,\n \"acc_norm_stderr\": 0.012751977967676008\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.027686913588013003,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.027686913588013003\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38922888616891066,\n \"mc1_stderr\": 0.01706855268069033,\n \"mc2\": 0.5648879973341684,\n \"mc2_stderr\": 0.01540236564556069\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7797947908445146,\n \"acc_stderr\": 0.011646276755089693\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6194086429112965,\n \"acc_stderr\": 0.013373971277729818\n }\n}\n```", "repo_url": "https://huggingface.co/Yuma42/KangalKhan-Ruby-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|arc:challenge|25_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|arc:challenge|25_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|gsm8k|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|gsm8k|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hellaswag|10_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hellaswag|10_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T04-22-19.518386.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T04-22-19.518386.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T04-27-32.929090.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T04-27-32.929090.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T04-27-32.929090.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T04-27-32.929090.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T04-22-19.518386.parquet"]}, 
{"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["**/details_harness|winogrande|5_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": ["**/details_harness|winogrande|5_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T04-27-32.929090.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_15T04_22_19.518386", "path": ["results_2024-02-15T04-22-19.518386.parquet"]}, {"split": "2024_02_15T04_27_32.929090", "path": 
["results_2024-02-15T04-27-32.929090.parquet"]}, {"split": "latest", "path": ["results_2024-02-15T04-27-32.929090.parquet"]}]}]} | 2024-02-15T04:29:52+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Yuma42/KangalKhan-Ruby-7B
Dataset automatically created during the evaluation run of model Yuma42/KangalKhan-Ruby-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
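As a minimal sketch (the repository id below is an assumption, following the leaderboard's usual `details_<org>__<model>` naming pattern rather than a name taken from this card):

```python
from datasets import load_dataset

# Hypothetical repository id, inferred from the leaderboard's
# details_<org>__<model> naming convention for this model.
data = load_dataset("open-llm-leaderboard/details_Yuma42__KangalKhan-Ruby-7B",
	"harness_winogrande_5",
	split="train")
```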
## Latest results
These are the latest results from run 2024-02-15T04:27:32.929090 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Yuma42/KangalKhan-Ruby-7B\n\n\n\nDataset automatically created during the evaluation run of model Yuma42/KangalKhan-Ruby-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T04:27:32.929090(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Yuma42/KangalKhan-Ruby-7B\n\n\n\nDataset automatically created during the evaluation run of model Yuma42/KangalKhan-Ruby-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T04:27:32.929090(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
91e924244de192a0c986c29c24da25b46dc1740b |
# Dataset Card for Evaluation run of DreadPoor/ToppyLake-Bagel-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DreadPoor/ToppyLake-Bagel-7B-slerp](https://huggingface.co/DreadPoor/ToppyLake-Bagel-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DreadPoor__ToppyLake-Bagel-7B-slerp",
"harness_winogrande_5",
split="train")
```
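
The aggregated metrics mentioned above live in the `results` configuration. As a minimal sketch (assuming this repository follows the same config/split layout as the other leaderboard detail datasets, i.e. a `results` config with a `latest` split):

```python
from datasets import load_dataset

# Aggregated metrics of the most recent run: "results" config, "latest" split.
results = load_dataset("open-llm-leaderboard/details_DreadPoor__ToppyLake-Bagel-7B-slerp",
	"results",
	split="latest")

print(results[0])  # inspect the first row of aggregated results
```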
## Latest results
These are the [latest results from run 2024-02-15T04:32:24.928446](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__ToppyLake-Bagel-7B-slerp/blob/main/results_2024-02-15T04-32-24.928446.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6512988710006998,
"acc_stderr": 0.03210207549502817,
"acc_norm": 0.6526926609673599,
"acc_norm_stderr": 0.03275941308944721,
"mc1": 0.45532435740514077,
"mc1_stderr": 0.01743349010253877,
"mc2": 0.6173825040313937,
"mc2_stderr": 0.015437835217405107
},
"harness|arc:challenge|25": {
"acc": 0.6621160409556314,
"acc_stderr": 0.013822047922283509,
"acc_norm": 0.6766211604095563,
"acc_norm_stderr": 0.013669421630012129
},
"harness|hellaswag|10": {
"acc": 0.6830312686715794,
"acc_stderr": 0.004643441945489851,
"acc_norm": 0.8570005974905397,
"acc_norm_stderr": 0.0034935679140932915
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337142,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337142
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406783,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723302,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723302
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.0274796030105388,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.0274796030105388
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121434,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121434
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097112,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097112
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857416,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.02934457250063433,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.02934457250063433
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163248,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163248
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553332,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553332
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676166,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676166
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.030500283176545847,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.030500283176545847
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.013306478243066302,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.013306478243066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323374,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323374
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3541899441340782,
"acc_stderr": 0.015995644947299232,
"acc_norm": 0.3541899441340782,
"acc_norm_stderr": 0.015995644947299232
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667878,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045704,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045704
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.027778298701545443,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.027778298701545443
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.019162418588623557,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.019162418588623557
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827072,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827072
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070813,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070813
},
"harness|truthfulqa:mc|0": {
"mc1": 0.45532435740514077,
"mc1_stderr": 0.01743349010253877,
"mc2": 0.6173825040313937,
"mc2_stderr": 0.015437835217405107
},
"harness|winogrande|5": {
"acc": 0.8318863456985004,
"acc_stderr": 0.01051033695416674
},
"harness|gsm8k|5": {
"acc": 0.5769522365428355,
"acc_stderr": 0.013608395641498405
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_DreadPoor__ToppyLake-Bagel-7B-slerp | [
"region:us"
] | 2024-02-15T04:34:42+00:00 | {"pretty_name": "Evaluation run of DreadPoor/ToppyLake-Bagel-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [DreadPoor/ToppyLake-Bagel-7B-slerp](https://huggingface.co/DreadPoor/ToppyLake-Bagel-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DreadPoor__ToppyLake-Bagel-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T04:32:24.928446](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__ToppyLake-Bagel-7B-slerp/blob/main/results_2024-02-15T04-32-24.928446.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6512988710006998,\n \"acc_stderr\": 0.03210207549502817,\n \"acc_norm\": 0.6526926609673599,\n \"acc_norm_stderr\": 0.03275941308944721,\n \"mc1\": 0.45532435740514077,\n \"mc1_stderr\": 0.01743349010253877,\n \"mc2\": 0.6173825040313937,\n \"mc2_stderr\": 0.015437835217405107\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6621160409556314,\n \"acc_stderr\": 0.013822047922283509,\n \"acc_norm\": 0.6766211604095563,\n \"acc_norm_stderr\": 0.013669421630012129\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6830312686715794,\n \"acc_stderr\": 0.004643441945489851,\n \"acc_norm\": 0.8570005974905397,\n \"acc_norm_stderr\": 0.0034935679140932915\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337142,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337142\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723302,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723302\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.0274796030105388,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.0274796030105388\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121434,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121434\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857416,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.02934457250063433,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.02934457250063433\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163248,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163248\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538271,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538271\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553332,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553332\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676166,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676166\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n \"acc_stderr\": 0.030500283176545847,\n \"acc_norm\": 0.7085201793721974,\n \"acc_norm_stderr\": 0.030500283176545847\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8339719029374202,\n \"acc_stderr\": 0.013306478243066302,\n \"acc_norm\": 0.8339719029374202,\n \"acc_norm_stderr\": 0.013306478243066302\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323374,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323374\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3541899441340782,\n \"acc_stderr\": 0.015995644947299232,\n \"acc_norm\": 0.3541899441340782,\n \"acc_norm_stderr\": 0.015995644947299232\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667878,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667878\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n \"acc_stderr\": 0.012739711554045704,\n \"acc_norm\": 0.4654498044328553,\n \"acc_norm_stderr\": 0.012739711554045704\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.027778298701545443,\n \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.027778298701545443\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.019162418588623557,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.019162418588623557\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070813,\n \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070813\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.45532435740514077,\n \"mc1_stderr\": 0.01743349010253877,\n \"mc2\": 0.6173825040313937,\n \"mc2_stderr\": 0.015437835217405107\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8318863456985004,\n \"acc_stderr\": 0.01051033695416674\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5769522365428355,\n \"acc_stderr\": 
0.013608395641498405\n }\n}\n```", "repo_url": "https://huggingface.co/DreadPoor/ToppyLake-Bagel-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|arc:challenge|25_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|gsm8k|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hellaswag|10_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T04-32-24.928446.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T04-32-24.928446.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T04-32-24.928446.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T04-32-24.928446.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T04-32-24.928446.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T04_32_24.928446", "path": ["**/details_harness|winogrande|5_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T04-32-24.928446.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_15T04_32_24.928446", "path": ["results_2024-02-15T04-32-24.928446.parquet"]}, {"split": "latest", "path": ["results_2024-02-15T04-32-24.928446.parquet"]}]}]} | 2024-02-15T04:35:04+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of DreadPoor/ToppyLake-Bagel-7B-slerp
Dataset automatically created during the evaluation run of model DreadPoor/ToppyLake-Bagel-7B-slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
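The snippet below is the standard pattern given in this card's metadata; the repository id and the example config name ("harness_winogrande_5") are taken from that metadata, and any of the other listed configs can be substituted.

```python
from datasets import load_dataset

# Load the per-sample details for one evaluation task of this run;
# the "train" split always points to the latest results.
data = load_dataset("open-llm-leaderboard/details_DreadPoor__ToppyLake-Bagel-7B-slerp",
	"harness_winogrande_5",
	split="train")
```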
## Latest results
These are the latest results from run 2024-02-15T04:32:24.928446 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of DreadPoor/ToppyLake-Bagel-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model DreadPoor/ToppyLake-Bagel-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T04:32:24.928446(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of DreadPoor/ToppyLake-Bagel-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model DreadPoor/ToppyLake-Bagel-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T04:32:24.928446(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
592fba2f02d85a672dde4e5789356f3b429375b7 | # Dataset Card for Dataset Name
summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
use this dataset to train a model
- **Curated by:** Le Minh
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** Llama2
- **License:** Llama2
### Dataset Sources [optional]
a
- **Repository:** github
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
train llama2 model
### Direct Use
load it and train
[More Information Needed]
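A minimal sketch of the "load it and train" flow above, assuming the dataset is published under the id shown on this page (LeMinhAtSJSU/NewData) and exposes the ID/summary/dialogue fields described under Dataset Structure; the split name and exact column spellings are assumptions:

```python
from datasets import load_dataset

# Hypothetical usage: the Hub id comes from this page; the "train" split and the
# lowercase column names ("dialogue", "summary") are assumed from the SAMSum-style
# schema mentioned later in this card.
dataset = load_dataset("LeMinhAtSJSU/NewData")
sample = dataset["train"][0]
prompt = f"Summarize this dialogue:\n{sample['dialogue']}\n\nSummary: {sample['summary']}"
print(prompt)  # one formatted training example for a Llama-2-style model
```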
### Out-of-Scope Use
text prediction
[More Information Needed]
## Dataset Structure
ID, summary and dialogue
[More Information Needed]
## Dataset Creation
### Curation Rationale
train LLM
[More Information Needed]
### Source Data
internet
#### Data Collection and Processing
conversation
[More Information Needed]
#### Who are the source data producers?
copy from samsum
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
train annotators, interannotator statistics, annotation validation, etc.
[More Information Needed]
#### Who are the annotators?
Le Minh
[More Information Needed]
#### Personal and Sensitive Information
no
[More Information Needed]
## Bias, Risks, and Limitations
no risk
[More Information Needed]
### Recommendations
do not use it
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | LeMinhAtSJSU/NewData | [
"task_categories:text-classification",
"size_categories:10K<n<100K",
"language:en",
"license:other",
"region:us"
] | 2024-02-15T04:46:39+00:00 | {"language": ["en"], "license": "other", "size_categories": ["10K<n<100K"], "task_categories": ["text-classification"]} | 2024-02-15T04:56:13+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-classification #size_categories-10K<n<100K #language-English #license-other #region-us
| # Dataset Card for Dataset Name
summary
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
use this dataset to train a module
- Curated by: Le Minh
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP): Llama2
- License:LLama2
### Dataset Sources [optional]
a
- Repository: github
- Paper [optional]:
- Demo [optional]:
## Uses
train llama2 model
### Direct Use
load it and train
### Out-of-Scope Use
text prediction
## Dataset Structure
ID, summary and dialogue
## Dataset Creation
### Curation Rationale
train LLM
### Source Data
internet
#### Data Collection and Processing
conversation
#### Who are the source data producers?
copy from samsum
### Annotations [optional]
#### Annotation process
train annotators, interannotator statistics, annotation validation, etc. -->
#### Who are the annotators?
Le Minh
#### Personal and Sensitive Information
no
## Bias, Risks, and Limitations
no risk
### Recommendations
do not use it
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\nsummary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\nuse this dataset to train a module\n\n\n\n- Curated by: Le Minh\n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): Llama2\n- License:LLama2",
"### Dataset Sources [optional]\n\na\n\n- Repository: github\n- Paper [optional]: \n- Demo [optional]:",
"## Uses\n\ntrain llama2 model",
"### Direct Use\n\nload it and train",
"### Out-of-Scope Use\n\ntext prediction",
"## Dataset Structure\n\nID, summary and dialogue",
"## Dataset Creation",
"### Curation Rationale\n\ntrain LLM",
"### Source Data\n\ninternet",
"#### Data Collection and Processing\n\nconversation",
"#### Who are the source data producers?\n\ncopy from samsum",
"### Annotations [optional]",
"#### Annotation process\n\ntrain annotators, interannotator statistics, annotation validation, etc. -->",
"#### Who are the annotators?\n\nLe Minh",
"#### Personal and Sensitive Information\n\nno",
"## Bias, Risks, and Limitations\n\nno risk",
"### Recommendations\n\ndo not use it\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#task_categories-text-classification #size_categories-10K<n<100K #language-English #license-other #region-us \n",
"# Dataset Card for Dataset Name\n\nsummary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\nuse this dataset to train a module\n\n\n\n- Curated by: Le Minh\n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): Llama2\n- License:LLama2",
"### Dataset Sources [optional]\n\na\n\n- Repository: github\n- Paper [optional]: \n- Demo [optional]:",
"## Uses\n\ntrain llama2 model",
"### Direct Use\n\nload it and train",
"### Out-of-Scope Use\n\ntext prediction",
"## Dataset Structure\n\nID, summary and dialogue",
"## Dataset Creation",
"### Curation Rationale\n\ntrain LLM",
"### Source Data\n\ninternet",
"#### Data Collection and Processing\n\nconversation",
"#### Who are the source data producers?\n\ncopy from samsum",
"### Annotations [optional]",
"#### Annotation process\n\ntrain annotators, interannotator statistics, annotation validation, etc. -->",
"#### Who are the annotators?\n\nLe Minh",
"#### Personal and Sensitive Information\n\nno",
"## Bias, Risks, and Limitations\n\nno risk",
"### Recommendations\n\ndo not use it\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
69bf254b2c4eda1f04d9e7a66153800caf17cc80 |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
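Although this section is still a placeholder, the card's metadata (dataset_info) declares a single config named "mydata" with fields such as Year, LocationAbbr, LocationDesc, Geolocation, Disease_Type, Data_Value_Type, Data_Value, Break_Out_Category, Break_Out_Details, Break_Out_Type, and Life_Expectancy. A minimal sketch for inspecting that structure, assuming the corresponding data files are actually uploaded under this repository:

```python
from datasets import load_dataset

# Assumes the "mydata" config declared in the card metadata has been pushed to the Hub
# under this repository, with a "train" split.
ds = load_dataset("Jiwonny29/test-dataset", "mydata")
print(ds)                    # available splits and row counts
print(ds["train"].features)  # Year, LocationAbbr, LocationDesc, Geolocation, ...
print(ds["train"][0])        # a single record
```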
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | Jiwonny29/test-dataset | [
"task_categories:feature-extraction",
"size_categories:100K<n<1M",
"language:en",
"license:apache-2.0",
"biology",
"region:us"
] | 2024-02-15T04:58:01+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["100K<n<1M"], "task_categories": ["feature-extraction"], "pretty_name": "test", "tags": ["biology"], "dataset_info": {"config_name": "mydata", "features": [{"name": "Year", "dtype": "integer"}, {"name": "LocationAbbr", "dtype": "string"}, {"name": "LocationDesc", "dtype": "string"}, {"name": "Geolocation", "dtype": "tuple"}, {"name": "Disease_Type", "dtype": "integer"}, {"name": "Data_Value_Type", "dtype": "integer"}, {"name": "Data_Value", "dtype": "float"}, {"name": "Break_Out_Category", "dtype": "string"}, {"name": "Break_Out_Details", "dtype": "string"}, {"name": "Break_Out_Type", "dtype": "integer"}, {"name": "Life_Expectancy", "dtype": "float"}]}} | 2024-02-16T19:48:50+00:00 | [] | [
"en"
] | TAGS
#task_categories-feature-extraction #size_categories-100K<n<1M #language-English #license-apache-2.0 #biology #region-us
|
# Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#task_categories-feature-extraction #size_categories-100K<n<1M #language-English #license-apache-2.0 #biology #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
1c4b925d9ea18003046db2aba53fa11fcf3f48a5 | ## Introduction
We release the annotated data used in [Dissecting Human and LLM Preferences](https://arxiv.org/abs/).
*Original Dataset* - The dataset is based on [lmsys/chatbot_arena_conversations](https://huggingface.co/datasets/lmsys/chatbot_arena_conversations), which contains 33K cleaned conversations with pairwise human preferences collected from 13K unique IP addresses on the [Chatbot Arena](https://lmsys.org/blog/2023-05-03-arena/) from April to June 2023.
*Filtering and Scenario-wise Sampling* - We filter out the conversations that are not in English, with "Tie" or "Both Bad" labels, and the multi-turn conversations. We first sample 400 samples with unsafe queries according to the OpenAI moderation API tags and the additional toxic tags in the original dataset, then we apply [Auto-J's scenario classifier](https://huggingface.co/GAIR/autoj-scenario-classifier) to determine the scenario of each sample (we merge the Auto-J's scenarios into 10 new ones). For the *Knowledge-aware* and *Others* scenarios, we pick 820 samples, and for the other scenarios, we pick 400 samples. The total number is 5,240.
*Collecting Preferences* - Besides the human preference labels in this original dataset, we also collect the binary preference labels from 32 LLMs, including 2 proprietary LLMs and 30 open-source ones.
*Annotation on Defined Properties* - We define a set of 29 properties and annotate how each property is satisfied (as a Likert-scale rating or a property-specific annotation) in all responses ($5,240\times 2=10,480$). See our paper for more details of the defined properties.
## Dataset Overview
An example of the json format is as follows:
```json
{
"query": "...",
"scenario_auto-j": "...",
"scenario_group": "...",
"response_1": {
"content": "...",
"model": "...",
"num_words": "..."
},
"response_2": {...},
"gpt-4-turbo_reference": "...",
"clear intent": "Yes/No",
"explicitly express feelings": "Yes/No",
"explicit constraints": [
...
],
"explicit subjective stances": [
...
],
"explicit mistakes or biases": [
...
],
"preference_labels": {
"human": "response_1/response_2",
"gpt-4-turbo": "response_1/response_2",
...
},
"basic_response_1": {
"admit limitations or mistakes": 0/1/2/3,
"authoritative tone": 0/1/2/3,
...
},
"basic_response_2": {...},
"errors_response_1": {
"applicable or not": "applicable/not applicable",
"errors":[
{
"brief description": "...",
"severity": "severe/moderate/minor",
"type": "...",
},
...
]
},
"errors_response_2": {...},
"query-specific_response_1": {
"clarify user intent": ...,
"correcting explicit mistakes or biases": None,
"satisfying explicit constraints": [
...
],
"showing empathetic": [
...
],
"supporting explicit subjective stances": [
...
]
},
"query-specific_response_2": {...}
}
```
The following fields are basic information:
- **query**: The user query.
- **scenario_auto-j**: The scenario classified by Auto-J's classifier.
- **scenario_group**: One of the 10 new scenarios we merged from Auto-J's scenarios, including an *Unsafe Query* scenario.
- **response_1/response_2**: The content of a response:
  - **content**: The text content.
  - **model**: The model that generated this response.
  - **num_words**: The number of words in this response, determined by NLTK.
- **gpt-4-turbo_reference**: A reference response generated by GPT-4-Turbo.
The following fields are Query-Specific prerequisites. For the last three, the list may be empty if there are no constraints/stances/mistakes.
- **clear intent**: Whether the intent of the user is clearly expressed in the query, "Yes" or "No".
- **explicitly express feelings**: Whether the user clearly expresses his/her feelings or emotions in the query, "Yes" or "No".
- **explicit constraints**: A list containing all the explicit constraints in the query.
- **explicit subjective stances**: A list containing all the subjective stances in the query.
- **explicit mistakes or biases**: A list containing all the mistakes or biases in the query.
The following fields are the main body of the annotation.
- **preference_labels**: The preference label for each judge (human or an LLM) indicating which response is preferred in a pair, "response_1/response_2".
- **basic_response_1/basic_response_2**: The annotated ratings of the 20 basic properties (except *lengthy*) for the response.
- **property_name**: 0/1/2/3
- ...
- **errors_response_1/errors_response_2**: The detected errors of the response.
  - **applicable or not**: If GPT-4-Turbo finds that it can reliably detect the errors in the response.
  - **errors**: A list containing the detected errors in the response.
    - **brief description**: A brief description of the error.
    - **severity**: How much the error affects the overall correctness of the response, "severe/moderate/minor".
    - **type**: The type of the error, "factual error/information contradiction to the query/math operation error/code generation error"
- **query-specific_response_1/query-specific_response_2**: The annotation results of the Query-Specific properties.
  - **clarify user intent**: If the user intent is not clear, rate how much the response helps clarify the intent, 0/1/2/3.
  - **showing empathetic**: If the user expresses feelings or emotions, rate how much the response shows empathy, 0/1/2/3.
  - **satisfying explicit constraints**: If there are explicit constraints in the query, rate how much the response satisfies each of them.
    - A list of "{description of constraint} | 0/1/2/3"
  - **correcting explicit mistakes or biases**: If there are mistakes or biases in the query, classify how the response corrects each of them
    - A list of "{description of mistake} | Pointed out and corrected/Pointed out but not corrected/Corrected without being pointed out/Neither pointed out nor corrected"
  - **supporting explicit subjective stances**: If there are subjective stances in the query, classify how the response supports each of them
    - A list of "{description of stance} | Strongly supported/Weakly supported/Neutral/Weakly opposed/Strongly opposed"
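Given the fields documented above, the snippet below is a minimal, illustrative sketch (assuming the Hugging Face `datasets` library; the repository id is the one shown on this card) of loading the split and measuring how often each LLM judge's preference label matches the human label:
```python
from collections import Counter, defaultdict
from datasets import load_dataset

# Single "train" split with 5,240 annotated comparison pairs.
ds = load_dataset("Preference-Dissection/preference-dissection", split="train")

agree, total = defaultdict(int), defaultdict(int)
for ex in ds:
    labels = ex["preference_labels"]  # dict: judge name -> "response_1"/"response_2"
    human = labels.get("human")
    if human is None:
        continue
    for judge, choice in labels.items():
        if judge == "human" or choice is None:
            continue
        total[judge] += 1
        agree[judge] += int(choice == human)

for judge in sorted(total, key=lambda j: agree[j] / total[j], reverse=True):
    print(f"{judge:>30}: {agree[judge] / total[judge]:.3f} agreement with human")

# Scenario distribution over the sampled queries.
print(Counter(ds["scenario_group"]))
```
The same pattern extends to the property ratings in `basic_response_1`/`basic_response_2`, for example to correlate a property score with whether that response was preferred.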
## Statistics
👇 Number of samples meeting 5 Query-specific prerequisites.
| Prerequisite | # | Prerequisite | # |
| ------------------------- | ----- | ---------------- | ---- |
| with explicit constraints | 1,418 | unclear intent | 459 |
| show subjective stances | 388 | express feelings | 121 |
| contain mistakes or bias | 401 | | |
👇 Mean Score/Count for each property in collected data. *The average scores of 5 query-specific properties are calculated only on samples where the queries met specific prerequisites.
| Property | Mean Score/Count | Property | Mean Score/Count |
| ---------------------------- | ---------------- | ---------------------------- | ---------------- |
| **Mean Score** | | | |
| harmless | 2.90 | persuasive | 0.27 |
| grammarly correct | 2.70 | step-by-step | 0.37 |
| friendly | 1.79 | use informal expressions | 0.04 |
| polite | 2.78 | clear | 2.54 |
| interactive | 0.22 | contain rich information | 1.74 |
| authoritative | 1.67 | novel | 0.47 |
| funny | 0.08 | relevant | 2.45 |
| use rhetorical devices | 0.16 | clarify intent* | 1.33 |
| complex word & sentence | 0.89 | show empathetic* | 1.48 |
| use supporting materials | 0.13 | satisfy constraints* | 2.01 |
| well formatted | 1.26 | support stances* | 2.28 |
| admit limits | 0.17 | correct mistakes* | 1.08 |
| **Mean Count** | | | |
| severe errors | 0.59 | minor errors | 0.23 |
| moderate errors | 0.61 | length | 164.52 |
👇 Property correlation in the annotated data.
<img src="./property_corr.PNG" alt="image-20240213145030747" style="zoom: 50%;" />
## Disclaimers and Terms
**This part is copied from the original dataset*
- **This dataset contains conversations that may be considered unsafe, offensive, or upsetting.** It is not intended for training dialogue agents without applying appropriate filtering measures. We are not responsible for any outputs of the models trained on this dataset.
- Statements or opinions made in this dataset do not reflect the views of researchers or institutions involved in the data collection effort.
- Users of this data are responsible for ensuring its appropriate use, which includes abiding by any applicable laws and regulations.
- Users of this data should adhere to the terms of use for a specific model when using its direct outputs.
- Users of this data agree to not attempt to determine the identity of individuals in this dataset.
## License
Following the original dataset, this dataset is licensed under CC-BY-NC-4.0.
| Preference-Dissection/preference-dissection | [
"language:en",
"license:cc-by-nc-4.0",
"region:us"
] | 2024-02-15T05:05:25+00:00 | {"language": ["en"], "license": "cc-by-nc-4.0", "pretty_name": "Preference Dissection", "dataset_info": {"features": [{"name": "query", "dtype": "string"}, {"name": "scenario_auto-j", "dtype": "string"}, {"name": "scenario_group", "dtype": "string"}, {"name": "response_1", "struct": [{"name": "content", "dtype": "string"}, {"name": "model", "dtype": "string"}, {"name": "num_words", "dtype": "int64"}]}, {"name": "response_2", "struct": [{"name": "content", "dtype": "string"}, {"name": "model", "dtype": "string"}, {"name": "num_words", "dtype": "int64"}]}, {"name": "gpt-4-turbo_reference", "dtype": "string"}, {"name": "clear intent", "dtype": "string"}, {"name": "explicitly express feelings", "dtype": "string"}, {"name": "explicit constraints", "sequence": "string"}, {"name": "explicit subjective stances", "sequence": "string"}, {"name": "explicit mistakes or biases", "sequence": "string"}, {"name": "preference_labels", "struct": [{"name": "gpt-3.5-turbo-1106", "dtype": "string"}, {"name": "gpt-4-1106-preview", "dtype": "string"}, {"name": "human", "dtype": "string"}, {"name": "llama-2-13b", "dtype": "string"}, {"name": "llama-2-13b-chat", "dtype": "string"}, {"name": "llama-2-70b", "dtype": "string"}, {"name": "llama-2-70b-chat", "dtype": "string"}, {"name": "llama-2-7b", "dtype": "string"}, {"name": "llama-2-7b-chat", "dtype": "string"}, {"name": "mistral-7b", "dtype": "string"}, {"name": "mistral-7b-instruct-v0.1", "dtype": "string"}, {"name": "mistral-7b-instruct-v0.2", "dtype": "string"}, {"name": "mistral-8x7b", "dtype": "string"}, {"name": "mistral-8x7b-instruct-v0.1", "dtype": "string"}, {"name": "qwen-14b", "dtype": "string"}, {"name": "qwen-14b-chat", "dtype": "string"}, {"name": "qwen-72b", "dtype": "string"}, {"name": "qwen-72b-chat", "dtype": "string"}, {"name": "qwen-7b", "dtype": "string"}, {"name": "qwen-7b-chat", "dtype": "string"}, {"name": "tulu-2-dpo-13b", "dtype": "string"}, {"name": "tulu-2-dpo-70b", "dtype": "string"}, {"name": "tulu-2-dpo-7b", "dtype": "string"}, {"name": "vicuna-13b-v1.5", "dtype": "string"}, {"name": "vicuna-7b-v1.5", "dtype": "string"}, {"name": "wizardLM-13b-v1.2", "dtype": "string"}, {"name": "wizardLM-70b-v1.0", "dtype": "string"}, {"name": "yi-34b", "dtype": "string"}, {"name": "yi-34b-chat", "dtype": "string"}, {"name": "yi-6b", "dtype": "string"}, {"name": "yi-6b-chat", "dtype": "string"}, {"name": "zephyr-7b-alpha", "dtype": "string"}, {"name": "zephyr-7b-beta", "dtype": "string"}]}, {"name": "basic_response_1", "struct": [{"name": "admit limitations or mistakes", "dtype": "int64"}, {"name": "authoritative tone", "dtype": "int64"}, {"name": "clear and understandable", "dtype": "int64"}, {"name": "complex word usage and sentence structure", "dtype": "int64"}, {"name": "friendly", "dtype": "int64"}, {"name": "funny and humorous", "dtype": "int64"}, {"name": "grammar, spelling, punctuation, and code-switching", "dtype": "int64"}, {"name": "harmlessness", "dtype": "int64"}, {"name": "information richness without considering inaccuracy", "dtype": "int64"}, {"name": "innovative and novel", "dtype": "int64"}, {"name": "interactive", "dtype": "int64"}, {"name": "metaphors, personification, similes, hyperboles, irony, parallelism", "dtype": "int64"}, {"name": "persuade user", "dtype": "int64"}, {"name": "polite", "dtype": "int64"}, {"name": "relevance without considering inaccuracy", "dtype": "int64"}, {"name": "repetitive", "dtype": "int64"}, {"name": "step by step solution", "dtype": "int64"}, {"name": "use of direct 
and explicit supporting materials", "dtype": "int64"}, {"name": "use of informal expressions", "dtype": "int64"}, {"name": "well formatted", "dtype": "int64"}]}, {"name": "basic_response_2", "struct": [{"name": "admit limitations or mistakes", "dtype": "int64"}, {"name": "authoritative tone", "dtype": "int64"}, {"name": "clear and understandable", "dtype": "int64"}, {"name": "complex word usage and sentence structure", "dtype": "int64"}, {"name": "friendly", "dtype": "int64"}, {"name": "funny and humorous", "dtype": "int64"}, {"name": "grammar, spelling, punctuation, and code-switching", "dtype": "int64"}, {"name": "harmlessness", "dtype": "int64"}, {"name": "information richness without considering inaccuracy", "dtype": "int64"}, {"name": "innovative and novel", "dtype": "int64"}, {"name": "interactive", "dtype": "int64"}, {"name": "metaphors, personification, similes, hyperboles, irony, parallelism", "dtype": "int64"}, {"name": "persuade user", "dtype": "int64"}, {"name": "polite", "dtype": "int64"}, {"name": "relevance without considering inaccuracy", "dtype": "int64"}, {"name": "repetitive", "dtype": "int64"}, {"name": "step by step solution", "dtype": "int64"}, {"name": "use of direct and explicit supporting materials", "dtype": "int64"}, {"name": "use of informal expressions", "dtype": "int64"}, {"name": "well formatted", "dtype": "int64"}]}, {"name": "errors_response_1", "struct": [{"name": "applicable or not", "dtype": "string"}, {"name": "errors", "list": [{"name": "brief description", "dtype": "string"}, {"name": "severity", "dtype": "string"}, {"name": "type", "dtype": "string"}]}]}, {"name": "errors_response_2", "struct": [{"name": "applicable or not", "dtype": "string"}, {"name": "errors", "list": [{"name": "brief description", "dtype": "string"}, {"name": "severity", "dtype": "string"}, {"name": "type", "dtype": "string"}]}]}, {"name": "query-specific_response_1", "struct": [{"name": "clarify user intent", "dtype": "int64"}, {"name": "correcting explicit mistakes or biases", "sequence": "string"}, {"name": "satisfying explicit constraints", "sequence": "string"}, {"name": "showing empathetic", "dtype": "int64"}, {"name": "supporting explicit subjective stances", "sequence": "string"}]}, {"name": "query-specific_response_2", "struct": [{"name": "clarify user intent", "dtype": "int64"}, {"name": "correcting explicit mistakes or biases", "sequence": "string"}, {"name": "satisfying explicit constraints", "sequence": "string"}, {"name": "showing empathetic", "dtype": "int64"}, {"name": "supporting explicit subjective stances", "sequence": "string"}]}], "splits": [{"name": "train", "num_bytes": 27617371, "num_examples": 5240}], "download_size": 13124269, "dataset_size": 27617371}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-15T05:17:10+00:00 | [] | [
"en"
] | TAGS
#language-English #license-cc-by-nc-4.0 #region-us
| Introduction
------------
We release the annotated data used in Dissecting Human and LLM Preferences.
*Original Dataset* - The dataset is based on lmsys/chatbot\_arena\_conversations, which contains 33K cleaned conversations with pairwise human preferences collected from 13K unique IP addresses on the Chatbot Arena from April to June 2023.
*Filtering and Scenario-wise Sampling* - We filter out the conversations that are not in English, with "Tie" or "Both Bad" labels, and the multi-turn conversations. We first sample 400 samples with unsafe queries according to the OpenAI moderation API tags and the additional toxic tags in the original dataset, then we apply Auto-J's scenario classifier to determine the scenario of each sample (we merge the Auto-J's scenarios into 10 new ones). For the *Knowledge-aware* and *Others* scenarios, we pick 820 samples, and for the other scenarios, we pick 400 samples. The total number is 5,240.
*Collecting Preferences* - Besides the human preference labels in this original dataset, we also collect the binary preference labels from 32 LLMs, including 2 proprietary LLMs and 30 open-source ones.
*Annotation on Defined Properties* - We define a set of 29 properties, we annotate how each property is satisfied (in Likert scale rating or property-specific annotation) in all responses ($5,240\times 2=10,480$). See our paper for more details of the defined properties.
Dataset Overview
----------------
An example of the json format is as follows:
The following fields are basic information:
* query: The user query.
* scenario\_auto-j: The scenario classified by Auto-J's classifier.
* scenario\_group: One of the 10 new scenarios we merged from the Auto-J's scenario, including an *Unsafe Query* scenario.
* response\_1/response\_2: The content of a response:
+ content: The text content.
+ model: The model that generate this response.
+ num\_words: The number of words of this response, determined by NLTK.
* gpt-4-turbo\_reference: An reference response generated by GPT-4-Turbo.
The following fields are Query-Specific prerequisites. For the last three, there may be an empty list if there is no constraints/stances/mistakes.
* clear intent: Whether the intent of the user is clearly expressed in the query, "Yes" or "No".
* explicitly express feelings: Whether the user clearly express his/her feelings or emotions in the query, "Yes" or "No".
* explicit constraints": A list containing all the explicit constraints in the query.
* explicit subjective stances: A list containing all the subjective stances in the query.
* explicit mistakes or biases: A list containing all the mistakes or biases in the query.
The following fields are the main body of the annotation.
* preference\_labels: The preference label for each judge (human or an LLM) indicating which response is preferred in a pair, "response\_1/response\_2".
* basic\_response\_1/basic\_response\_2: The annotated ratings of the 20 basic properties (except *lengthy*) for the response.
+ property\_name: 0/1/2/3
+ ...
* errors\_response\_1/errors\_response\_2: The detected errors of the response.
+ applicable or not: If GPT-4-Turbo find itself can reliably detect the errors in the response.
+ errors: A list containing the detected errors in the response.
- brief description: A brief description of the error.
- severity: How much the error affect the overall correctness of the response, "severe/moderate/minor".
- type: The type of the error, "factual error/information contradiction to the query/math operation error/code generation error"
* query-specific\_response\_1/query-specific\_response\_2: The annotation results of the Query-Specific properties.
+ clarify user intent: If the user intent is not clear, rate how much the response help clarify the intent, 0/1/2/3.
+ showing empathetic: If the user expresses feelings or emotions, rate how much the response show empathetic, 0/1/2/3.
+ satisfying explicit constraints: If there are explicit constraints in the query, rate how much the response satisfy each of them.
- A list of "{description of constraint} | 0/1/2/3"
+ correcting explicit mistakes or biases: If there are mistakes of biases in the query, classify how the response correct each of them
- A list of "{description of mistake} | Pointed out and corrected/Pointed out but not corrected/Corrected without being pointed out/Neither pointed out nor corrected"
+ supporting explicit subjective stances: If there are subject stances in the query, classify how the response support each of them
- A list of "{description of stance} | Strongly supported/Weakly supported/Neutral/Weakly opposed/Strongly opposed"
Statistics
----------
Number of samples meeting 5 Query-specific prerequisites.
Mean Score/Count for each property in collected data. \*The average scores of 5 query-specific properties are calculated only on samples where the queries met specific prerequisites.
Property correlation in the annotated data.

Disclaimers and Terms
---------------------
This part is copied from the original dataset\*
* This dataset contains conversations that may be considered unsafe, offensive, or upsetting. It is not intended for training dialogue agents without applying appropriate filtering measures. We are not responsible for any outputs of the models trained on this dataset.
* Statements or opinions made in this dataset do not reflect the views of researchers or institutions involved in the data collection effort.
* Users of this data are responsible for ensuring its appropriate use, which includes abiding by any applicable laws and regulations.
* Users of this data should adhere to the terms of use for a specific model when using its direct outputs.
* Users of this data agree to not attempt to determine the identity of individuals in this dataset.
License
-------
Following the original dataset, this dataset is licensed under CC-BY-NC-4.0.
| [] | [
"TAGS\n#language-English #license-cc-by-nc-4.0 #region-us \n"
] |
ee07492a2d02087ebfdc4fa228231599e671787b |
# Syiah Kuala University Dataset V1
## Source
- [ULT USK](http://ult.usk.ac.id/) (Unit Layanan Terpadu USK)
## Content
1. QnA data for new students (Data QnA bagi Mahasiswa baru)
## Contact
- You can contact me by email: [email protected]
"license:llama2",
"region:us"
] | 2024-02-15T05:12:34+00:00 | {"license": "llama2"} | 2024-02-15T05:31:39+00:00 | [] | [] | TAGS
#license-llama2 #region-us
|
# Syiah Kuala University Dataset V1
## Source
- ULT USK (Unit Layanan Terpadu USK)
## Content
1. Data QnA bagi Mahasiswa baru
## Contact
- You Can Contact Me at Email : haryrachmat10@URL | [
"# Syiah Kuala University Dataset V1",
"## Source \n\n- ULT USK (Unit Layanan Terpadu USK)",
"## Content \n\n1. Data QnA bagi Mahasiswa baru",
"## Contact \n\n- You Can Contact Me at Email : haryrachmat10@URL"
] | [
"TAGS\n#license-llama2 #region-us \n",
"# Syiah Kuala University Dataset V1",
"## Source \n\n- ULT USK (Unit Layanan Terpadu USK)",
"## Content \n\n1. Data QnA bagi Mahasiswa baru",
"## Contact \n\n- You Can Contact Me at Email : haryrachmat10@URL"
] |
ff27ff9bdec828bc017a3dfb8499903be4a56101 | # Dataset Card for "comparisons_20k_regen_labeled_dpo1b1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | arianhosseini/comparisons_20k_regen_labeled_dpo1b1 | [
"region:us"
] | 2024-02-15T05:30:56+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 36118675, "num_examples": 20000}], "download_size": 20607508, "dataset_size": 36118675}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-15T05:30:59+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "comparisons_20k_regen_labeled_dpo1b1"
More Information needed | [
"# Dataset Card for \"comparisons_20k_regen_labeled_dpo1b1\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"comparisons_20k_regen_labeled_dpo1b1\"\n\nMore Information needed"
] |
d506f7958fe890c4873a7adcc2ddf531c9b50550 |
my_dataset_repository/
├── README.md
├── Clean Data 2023.csv
├── abstract 2023.csv
├── conclusion 2023.csv
└── combined 2023.csv | alwanrahmana/ner_2023 | [
"task_categories:token-classification",
"size_categories:n<1K",
"language:id",
"license:pddl",
"region:us"
] | 2024-02-15T05:38:05+00:00 | {"language": ["id"], "license": "pddl", "size_categories": ["n<1K"], "task_categories": ["token-classification"], "pretty_name": "NER2023"} | 2024-02-15T09:02:58+00:00 | [] | [
"id"
] | TAGS
#task_categories-token-classification #size_categories-n<1K #language-Indonesian #license-pddl #region-us
|
my_dataset_repository/
├── URL
├── Clean Data URL
├── abstract URL
└── conclusion URL
└── combined URL | [] | [
"TAGS\n#task_categories-token-classification #size_categories-n<1K #language-Indonesian #license-pddl #region-us \n"
] |
016c1b8969004a8e799166ade738d74c04bf2f09 |
# Dataset Card for Evaluation run of Changgil/k2s3_test_24001
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Changgil/k2s3_test_24001](https://huggingface.co/Changgil/k2s3_test_24001) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Changgil__k2s3_test_24001",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-15T07:38:41.232311](https://huggingface.co/datasets/open-llm-leaderboard/details_Changgil__k2s3_test_24001/blob/main/results_2024-02-15T07-38-41.232311.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5457607639419929,
"acc_stderr": 0.03381228856533623,
"acc_norm": 0.5506067592536232,
"acc_norm_stderr": 0.03452302087358302,
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502342,
"mc2": 0.4357245447683409,
"mc2_stderr": 0.01457057655258036
},
"harness|arc:challenge|25": {
"acc": 0.5136518771331058,
"acc_stderr": 0.014605943429860947,
"acc_norm": 0.5571672354948806,
"acc_norm_stderr": 0.014515573873348902
},
"harness|hellaswag|10": {
"acc": 0.6011750647281418,
"acc_stderr": 0.004886559008754983,
"acc_norm": 0.8069109739095798,
"acc_norm_stderr": 0.003939155484500657
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490437,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490437
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.335978835978836,
"acc_stderr": 0.024326310529149138,
"acc_norm": 0.335978835978836,
"acc_norm_stderr": 0.024326310529149138
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303317,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303317
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.027045746573534327,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.027045746573534327
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4236453201970443,
"acc_stderr": 0.03476725747649037,
"acc_norm": 0.4236453201970443,
"acc_norm_stderr": 0.03476725747649037
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031595,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031595
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6868686868686869,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.6868686868686869,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7668393782383419,
"acc_stderr": 0.03051611137147602,
"acc_norm": 0.7668393782383419,
"acc_norm_stderr": 0.03051611137147602
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5076923076923077,
"acc_stderr": 0.025348006031534778,
"acc_norm": 0.5076923076923077,
"acc_norm_stderr": 0.025348006031534778
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.0279404571362284,
"acc_norm": 0.3,
"acc_norm_stderr": 0.0279404571362284
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5462184873949579,
"acc_stderr": 0.03233943468182088,
"acc_norm": 0.5462184873949579,
"acc_norm_stderr": 0.03233943468182088
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7357798165137615,
"acc_stderr": 0.01890416417151019,
"acc_norm": 0.7357798165137615,
"acc_norm_stderr": 0.01890416417151019
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.41203703703703703,
"acc_stderr": 0.03356787758160835,
"acc_norm": 0.41203703703703703,
"acc_norm_stderr": 0.03356787758160835
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7303921568627451,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.7303921568627451,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.02931281415395592,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.02931281415395592
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.026453508054040318,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.026453508054040318
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7535121328224776,
"acc_stderr": 0.01541130876968693,
"acc_norm": 0.7535121328224776,
"acc_norm_stderr": 0.01541130876968693
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6098265895953757,
"acc_stderr": 0.026261677607806642,
"acc_norm": 0.6098265895953757,
"acc_norm_stderr": 0.026261677607806642
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3474860335195531,
"acc_stderr": 0.015925564060208154,
"acc_norm": 0.3474860335195531,
"acc_norm_stderr": 0.015925564060208154
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.027826109307283686,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.027826109307283686
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.594855305466238,
"acc_stderr": 0.027882383791325953,
"acc_norm": 0.594855305466238,
"acc_norm_stderr": 0.027882383791325953
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.027339546640662734,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.027339546640662734
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3891786179921773,
"acc_stderr": 0.012452613934287012,
"acc_norm": 0.3891786179921773,
"acc_norm_stderr": 0.012452613934287012
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5375816993464052,
"acc_stderr": 0.020170614974969758,
"acc_norm": 0.5375816993464052,
"acc_norm_stderr": 0.020170614974969758
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.03086214492108756,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.03086214492108756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555402,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555402
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.031885780176863984,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.031885780176863984
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502342,
"mc2": 0.4357245447683409,
"mc2_stderr": 0.01457057655258036
},
"harness|winogrande|5": {
"acc": 0.7569060773480663,
"acc_stderr": 0.012055665630431037
},
"harness|gsm8k|5": {
"acc": 0.2979529946929492,
"acc_stderr": 0.012597932232914517
}
}
```
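To aggregate the per-task numbers above outside of the leaderboard, a small sketch could look like the following (assuming the JSON block above has been saved locally as `results.json`; the file name is arbitrary):
```python
import json
from statistics import mean

# Assumes the "latest results" JSON shown above was saved as results.json.
with open("results.json") as f:
    results = json.load(f)

# Mean accuracy over the MMLU (hendrycksTest) subtasks.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"MMLU subtasks: {len(mmlu)}, mean acc: {mean(mmlu):.4f}")

# Headline metrics reported by the harness.
print("ARC acc_norm:      ", results["harness|arc:challenge|25"]["acc_norm"])
print("HellaSwag acc_norm:", results["harness|hellaswag|10"]["acc_norm"])
print("TruthfulQA mc2:    ", results["harness|truthfulqa:mc|0"]["mc2"])
print("Winogrande acc:    ", results["harness|winogrande|5"]["acc"])
print("GSM8K acc:         ", results["harness|gsm8k|5"]["acc"])
```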
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Changgil__k2s3_test_24001 | [
"region:us"
] | 2024-02-15T06:16:30+00:00 | {"pretty_name": "Evaluation run of Changgil/k2s3_test_24001", "dataset_summary": "Dataset automatically created during the evaluation run of model [Changgil/k2s3_test_24001](https://huggingface.co/Changgil/k2s3_test_24001) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Changgil__k2s3_test_24001\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T07:38:41.232311](https://huggingface.co/datasets/open-llm-leaderboard/details_Changgil__k2s3_test_24001/blob/main/results_2024-02-15T07-38-41.232311.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5457607639419929,\n \"acc_stderr\": 0.03381228856533623,\n \"acc_norm\": 0.5506067592536232,\n \"acc_norm_stderr\": 0.03452302087358302,\n \"mc1\": 0.2864137086903305,\n \"mc1_stderr\": 0.015826142439502342,\n \"mc2\": 0.4357245447683409,\n \"mc2_stderr\": 0.01457057655258036\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5136518771331058,\n \"acc_stderr\": 0.014605943429860947,\n \"acc_norm\": 0.5571672354948806,\n \"acc_norm_stderr\": 0.014515573873348902\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6011750647281418,\n \"acc_stderr\": 0.004886559008754983,\n \"acc_norm\": 0.8069109739095798,\n \"acc_norm_stderr\": 0.003939155484500657\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.5407407407407407,\n \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490437,\n \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490437\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n 
\"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.335978835978836,\n \"acc_stderr\": 0.024326310529149138,\n \"acc_norm\": 0.335978835978836,\n \"acc_norm_stderr\": 0.024326310529149138\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.04134913018303317,\n \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.04134913018303317\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6548387096774193,\n \"acc_stderr\": 0.027045746573534327,\n \"acc_norm\": 0.6548387096774193,\n \"acc_norm_stderr\": 0.027045746573534327\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4236453201970443,\n \"acc_stderr\": 0.03476725747649037,\n \"acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.03476725747649037\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031595,\n \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031595\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7668393782383419,\n \"acc_stderr\": 0.03051611137147602,\n \"acc_norm\": 0.7668393782383419,\n \"acc_norm_stderr\": 0.03051611137147602\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.5076923076923077,\n \"acc_stderr\": 0.025348006031534778,\n \"acc_norm\": 0.5076923076923077,\n \"acc_norm_stderr\": 0.025348006031534778\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.0279404571362284,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.0279404571362284\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182088,\n \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182088\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7357798165137615,\n \"acc_stderr\": 0.01890416417151019,\n \"acc_norm\": 0.7357798165137615,\n \"acc_norm_stderr\": 0.01890416417151019\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7303921568627451,\n \"acc_stderr\": 0.031145570659486782,\n \"acc_norm\": 0.7303921568627451,\n \"acc_norm_stderr\": 0.031145570659486782\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7172995780590717,\n \"acc_stderr\": 0.02931281415395592,\n \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.02931281415395592\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n \"acc_stderr\": 0.026453508054040318,\n \"acc_norm\": 0.7948717948717948,\n \"acc_norm_stderr\": 0.026453508054040318\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7535121328224776,\n \"acc_stderr\": 0.01541130876968693,\n \"acc_norm\": 0.7535121328224776,\n 
\"acc_norm_stderr\": 0.01541130876968693\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6098265895953757,\n \"acc_stderr\": 0.026261677607806642,\n \"acc_norm\": 0.6098265895953757,\n \"acc_norm_stderr\": 0.026261677607806642\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3474860335195531,\n \"acc_stderr\": 0.015925564060208154,\n \"acc_norm\": 0.3474860335195531,\n \"acc_norm_stderr\": 0.015925564060208154\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.027826109307283686,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.027826109307283686\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.594855305466238,\n \"acc_stderr\": 0.027882383791325953,\n \"acc_norm\": 0.594855305466238,\n \"acc_norm_stderr\": 0.027882383791325953\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.027339546640662734,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.027339546640662734\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.02899908090480618,\n \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.02899908090480618\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3891786179921773,\n \"acc_stderr\": 0.012452613934287012,\n \"acc_norm\": 0.3891786179921773,\n \"acc_norm_stderr\": 0.012452613934287012\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5375816993464052,\n \"acc_stderr\": 0.020170614974969758,\n \"acc_norm\": 0.5375816993464052,\n \"acc_norm_stderr\": 0.020170614974969758\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.03086214492108756,\n \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.03086214492108756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n \"acc_stderr\": 0.03152439186555402,\n \"acc_norm\": 0.7263681592039801,\n \"acc_norm_stderr\": 0.03152439186555402\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2864137086903305,\n \"mc1_stderr\": 0.015826142439502342,\n \"mc2\": 0.4357245447683409,\n \"mc2_stderr\": 0.01457057655258036\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7569060773480663,\n \"acc_stderr\": 0.012055665630431037\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2979529946929492,\n \"acc_stderr\": 0.012597932232914517\n }\n}\n```", "repo_url": "https://huggingface.co/Changgil/k2s3_test_24001", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|arc:challenge|25_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|arc:challenge|25_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|gsm8k|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|gsm8k|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hellaswag|10_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hellaswag|10_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T06-14-12.620691.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T06-14-12.620691.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T07-38-41.232311.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T07-38-41.232311.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T07-38-41.232311.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T07-38-41.232311.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T06-14-12.620691.parquet"]}, 
{"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["**/details_harness|winogrande|5_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": ["**/details_harness|winogrande|5_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T07-38-41.232311.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_15T06_14_12.620691", "path": ["results_2024-02-15T06-14-12.620691.parquet"]}, {"split": "2024_02_15T07_38_41.232311", "path": 
["results_2024-02-15T07-38-41.232311.parquet"]}, {"split": "latest", "path": ["results_2024-02-15T07-38-41.232311.parquet"]}]}]} | 2024-02-15T07:41:19+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Changgil/k2s3_test_24001
Dataset automatically created during the evaluation run of model Changgil/k2s3_test_24001 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
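A minimal sketch is given below. The config name `harness_winogrande_5` appears in the metadata above; the repository id `open-llm-leaderboard/details_Changgil__k2s3_test_24001` is inferred from the usual naming convention of these detail datasets, so treat it as an assumption rather than a confirmed value.
```python
from datasets import load_dataset

# Assumed repository id, following the "details_<org>__<model>" naming convention.
data = load_dataset("open-llm-leaderboard/details_Changgil__k2s3_test_24001",
	"harness_winogrande_5",
	split="train")
```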
## Latest results
These are the latest results from run 2024-02-15T07:38:41.232311 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Changgil/k2s3_test_24001\n\n\n\nDataset automatically created during the evaluation run of model Changgil/k2s3_test_24001 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T07:38:41.232311(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Changgil/k2s3_test_24001\n\n\n\nDataset automatically created during the evaluation run of model Changgil/k2s3_test_24001 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T07:38:41.232311(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
bd9e7fc6eab13a85a815dc37d98a786025af6af8 |
# Dataset Card for Evaluation run of FelixChao/Capricorn-7B-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/Capricorn-7B-DPO](https://huggingface.co/FelixChao/Capricorn-7B-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
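# Per-sample details for the "harness_winogrande_5" config; the "train" split points to the latest run.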
data = load_dataset("open-llm-leaderboard/details_FelixChao__Capricorn-7B-DPO",
"harness_winogrande_5",
split="train")
```
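If only the aggregated scores are needed rather than the per-sample details, the "results" configuration mentioned above can be loaded the same way. This is a sketch assuming the standard layout of these leaderboard detail datasets, where the "latest" split points at the most recent run:
```python
from datasets import load_dataset

# Aggregated metrics for the run; the "latest" split points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_FelixChao__Capricorn-7B-DPO",
	"results",
	split="latest")
print(results[0])  # a single row containing the aggregated results
```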
## Latest results
These are the [latest results from run 2024-02-15T06:20:03.216862](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Capricorn-7B-DPO/blob/main/results_2024-02-15T06-20-03.216862.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6488413542940337,
"acc_stderr": 0.03217471597461982,
"acc_norm": 0.64849979912089,
"acc_norm_stderr": 0.03284318919319882,
"mc1": 0.6119951040391677,
"mc1_stderr": 0.01705876150134798,
"mc2": 0.7723177165257333,
"mc2_stderr": 0.013804607975615193
},
"harness|arc:challenge|25": {
"acc": 0.7005119453924915,
"acc_stderr": 0.01338502163731357,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545796
},
"harness|hellaswag|10": {
"acc": 0.6967735510854411,
"acc_stderr": 0.004587128273935072,
"acc_norm": 0.8846843258315077,
"acc_norm_stderr": 0.0031874975090874164
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047423976,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047423976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933713,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778405,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778405
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.0315841532404771,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.0315841532404771
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135363,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135363
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.01570349834846177,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.01570349834846177
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621126,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621126
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.0349814938546247,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.0349814938546247
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.024332146779134128,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.024332146779134128
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4245810055865922,
"acc_stderr": 0.016531170993278884,
"acc_norm": 0.4245810055865922,
"acc_norm_stderr": 0.016531170993278884
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.012752858346533126,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.012752858346533126
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.018798086284886887,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.018798086284886887
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6119951040391677,
"mc1_stderr": 0.01705876150134798,
"mc2": 0.7723177165257333,
"mc2_stderr": 0.013804607975615193
},
"harness|winogrande|5": {
"acc": 0.8310970797158642,
"acc_stderr": 0.010529981411838899
},
"harness|gsm8k|5": {
"acc": 0.7028051554207733,
"acc_stderr": 0.012588685966624174
}
}
```
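If you only need the aggregated scores shown above rather than the per-sample details, a minimal sketch is to load the `results` configuration directly (both the `results` configuration and the `latest` split are defined for this dataset; the variable name below is illustrative):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of FelixChao/Capricorn-7B-DPO
results = load_dataset("open-llm-leaderboard/details_FelixChao__Capricorn-7B-DPO",
	"results",
	split="latest")
```

The per-task configurations (e.g. `harness_gsm8k_5`, `harness_winogrande_5`) can be loaded the same way by swapping the configuration name.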
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_FelixChao__Capricorn-7B-DPO | [
"region:us"
] | 2024-02-15T06:22:21+00:00 | {"pretty_name": "Evaluation run of FelixChao/Capricorn-7B-DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [FelixChao/Capricorn-7B-DPO](https://huggingface.co/FelixChao/Capricorn-7B-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__Capricorn-7B-DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T06:20:03.216862](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__Capricorn-7B-DPO/blob/main/results_2024-02-15T06-20-03.216862.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6488413542940337,\n \"acc_stderr\": 0.03217471597461982,\n \"acc_norm\": 0.64849979912089,\n \"acc_norm_stderr\": 0.03284318919319882,\n \"mc1\": 0.6119951040391677,\n \"mc1_stderr\": 0.01705876150134798,\n \"mc2\": 0.7723177165257333,\n \"mc2_stderr\": 0.013804607975615193\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7005119453924915,\n \"acc_stderr\": 0.01338502163731357,\n \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545796\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6967735510854411,\n \"acc_stderr\": 0.004587128273935072,\n \"acc_norm\": 0.8846843258315077,\n \"acc_norm_stderr\": 0.0031874975090874164\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933713,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 
0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.0315841532404771,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.0315841532404771\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135363,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135363\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.01570349834846177,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.01570349834846177\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621126,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621126\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.0349814938546247,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.0349814938546247\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8237547892720306,\n \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.024332146779134128,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.024332146779134128\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4245810055865922,\n \"acc_stderr\": 0.016531170993278884,\n \"acc_norm\": 0.4245810055865922,\n \"acc_norm_stderr\": 0.016531170993278884\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n \"acc_stderr\": 0.012752858346533126,\n \"acc_norm\": 0.47392438070404175,\n \"acc_norm_stderr\": 0.012752858346533126\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.684640522875817,\n \"acc_stderr\": 0.018798086284886887,\n \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.018798086284886887\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6119951040391677,\n \"mc1_stderr\": 0.01705876150134798,\n \"mc2\": 0.7723177165257333,\n \"mc2_stderr\": 0.013804607975615193\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8310970797158642,\n \"acc_stderr\": 0.010529981411838899\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7028051554207733,\n \"acc_stderr\": 0.012588685966624174\n 
}\n}\n```", "repo_url": "https://huggingface.co/FelixChao/Capricorn-7B-DPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|arc:challenge|25_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|gsm8k|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hellaswag|10_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T06-20-03.216862.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T06-20-03.216862.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T06-20-03.216862.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T06-20-03.216862.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T06-20-03.216862.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T06_20_03.216862", "path": ["**/details_harness|winogrande|5_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T06-20-03.216862.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_15T06_20_03.216862", "path": ["results_2024-02-15T06-20-03.216862.parquet"]}, {"split": "latest", "path": ["results_2024-02-15T06-20-03.216862.parquet"]}]}]} | 2024-02-15T06:22:43+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of FelixChao/Capricorn-7B-DPO
Dataset automatically created during the evaluation run of model FelixChao/Capricorn-7B-DPO on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
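A minimal loading sketch is shown below; the repository id follows the `open-llm-leaderboard/details_<org>__<model>` naming convention used by these evaluation detail datasets, and `harness_winogrande_5` is one of the per-task configurations, so treat both as assumptions rather than verified values.
```python
from datasets import load_dataset

# Repository id assumed from the leaderboard's details_<org>__<model> naming convention
data = load_dataset("open-llm-leaderboard/details_FelixChao__Capricorn-7B-DPO",
	"harness_winogrande_5",
	split="train")
```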
## Latest results
These are the latest results from run 2024-02-15T06:20:03.216862 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of FelixChao/Capricorn-7B-DPO\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/Capricorn-7B-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T06:20:03.216862(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of FelixChao/Capricorn-7B-DPO\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/Capricorn-7B-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T06:20:03.216862(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
2e362ceeb23cb992c2e81bacbf0342c7770fa721 |
# Dataset Card for Dataset Name
This dataset card provides an overview of the Research Phrases Dataset, designed for training and evaluating language models (LLMs) to generate contextually relevant phrases for various sections of research papers, particularly within the fields of biology and bioinformatics. The dataset includes structured inputs with metadata and prompts to guide the model in generating outputs tailored to the specific needs of academic writing.
### Dataset Description
The Research Phrases Dataset comprises thousands of phrases structured to assist in the generation of academic content across different sections of research papers. Each entry is designed with a conditional generation approach, incorporating metadata such as the field of study, keywords, and structured prompts. This method aims to enhance the model's ability to produce section-specific text, making it a valuable resource for automating parts of the research writing process.
## Uses
The Research Phrases Dataset is intended for direct use in training and evaluating language models geared towards academic writing assistance.
### Direct Use
It can be particularly useful in applications such as:
- **Automated Writing Tools:** Supporting the development of tools that assist researchers in drafting various sections of their papers by providing contextually relevant phrases and sentences.
- **Educational Purposes:** Aiding in the education of students and early-career researchers in the structuring and writing of academic papers by offering examples of how specific sections can be articulated.
- **Content Generation:** Facilitating the generation of draft content for research papers, abstracts, and proposals, especially in the fields of biology and bioinformatics.
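A minimal sketch for loading the dataset with the Hugging Face `datasets` library is shown below; the repository id `yashm/phrases` is taken from this card's metadata, and the split name is an assumption, so adjust both if the actual configuration differs.
```python
from datasets import load_dataset

# Repository id from this card's metadata; "train" is an assumed default split
phrases = load_dataset("yashm/phrases", split="train")

# Inspect one record without assuming specific field names
print(phrases[0])
```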
| yashm/phrases | [
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:en",
"license:cc-by-sa-4.0",
"region:us"
] | 2024-02-15T06:24:26+00:00 | {"language": ["en"], "license": "cc-by-sa-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"]} | 2024-02-15T06:43:10+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #size_categories-1K<n<10K #language-English #license-cc-by-sa-4.0 #region-us
|
# Dataset Card for Dataset Name
This dataset card provides an overview of the Research Phrases Dataset, designed for training and evaluating language models (LLMs) to generate contextually relevant phrases for various sections of research papers, particularly within the fields of biology and bioinformatics. The dataset includes structured inputs with metadata and prompts to guide the model in generating outputs tailored to the specific needs of academic writing.
### Dataset Description
The Research Phrases Dataset comprises thousands of phrases structured to assist in the generation of academic content across different sections of research papers. Each entry is designed with a conditional generation approach, incorporating metadata such as the field of study, keywords, and structured prompts. This method aims to enhance the model's ability to produce section-specific text, making it a valuable resource for automating parts of the research writing process.
## Uses
The Research Phrases Dataset is intended for direct use in training and evaluating language models geared towards academic writing assistance.
### Direct Use
It can be particularly useful in applications such as:
- **Automated Writing Tools:** Supporting the development of tools that assist researchers in drafting various sections of their papers by providing contextually relevant phrases and sentences.
- **Educational Purposes:** Aiding in the education of students and early-career researchers in the structuring and writing of academic papers by offering examples of how specific sections can be articulated.
- **Content Generation:** Facilitating the generation of draft content for research papers, abstracts, and proposals, especially in the fields of biology and bioinformatics.
| [
"# Dataset Card for Dataset Name\n\nThis dataset card provides an overview of the Research Phrases Dataset, designed for training and evaluating language models (LLMs) to generate contextually relevant phrases for various sections of research papers, particularly within the fields of biology and bioinformatics. The dataset includes structured inputs with metadata and prompts to guide the model in generating outputs tailored to the specific needs of academic writing.",
"### Dataset Description\n\nThe Research Phrases Dataset comprises thousands of phrases structured to assist in the generation of academic content across different sections of research papers. Each entry is designed with a conditional generation approach, incorporating metadata such as the field of study, keywords, and structured prompts. This method aims to enhance the model's ability to produce section-specific text, making it a valuable resource for automating parts of the research writing process.",
"## Uses\n\nThe Research Phrases Dataset is intended for direct use in training and evaluating language models geared towards academic writing assistance.",
"### Direct Use\n\nIt can be particularly useful in applications such as:\n\nAutomated Writing Tools: Supporting the development of tools that assist researchers in drafting various sections of their papers by providing contextually relevant phrases and sentences.\nEducational Purposes: Aiding in the education of students and early-career researchers in the structuring and writing of academic papers by offering examples of how specific sections can be articulated.\nContent Generation: Facilitating the generation of draft content for research papers, abstracts, and proposals, especially in the fields of biology and bioinformatics."
] | [
"TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-English #license-cc-by-sa-4.0 #region-us \n",
"# Dataset Card for Dataset Name\n\nThis dataset card provides an overview of the Research Phrases Dataset, designed for training and evaluating language models (LLMs) to generate contextually relevant phrases for various sections of research papers, particularly within the fields of biology and bioinformatics. The dataset includes structured inputs with metadata and prompts to guide the model in generating outputs tailored to the specific needs of academic writing.",
"### Dataset Description\n\nThe Research Phrases Dataset comprises thousands of phrases structured to assist in the generation of academic content across different sections of research papers. Each entry is designed with a conditional generation approach, incorporating metadata such as the field of study, keywords, and structured prompts. This method aims to enhance the model's ability to produce section-specific text, making it a valuable resource for automating parts of the research writing process.",
"## Uses\n\nThe Research Phrases Dataset is intended for direct use in training and evaluating language models geared towards academic writing assistance.",
"### Direct Use\n\nIt can be particularly useful in applications such as:\n\nAutomated Writing Tools: Supporting the development of tools that assist researchers in drafting various sections of their papers by providing contextually relevant phrases and sentences.\nEducational Purposes: Aiding in the education of students and early-career researchers in the structuring and writing of academic papers by offering examples of how specific sections can be articulated.\nContent Generation: Facilitating the generation of draft content for research papers, abstracts, and proposals, especially in the fields of biology and bioinformatics."
] |
020762ae4459074aefeefc3f203efad01334739a | # Dataset Card for "glaive-code-assist"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | chiennv/glaive-code-assist | [
"region:us"
] | 2024-02-15T06:51:47+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "conversations", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "source", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 327973162, "num_examples": 182240}], "download_size": 160647616, "dataset_size": 327973162}} | 2024-02-15T06:51:59+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "glaive-code-assist"
More Information needed | [
"# Dataset Card for \"glaive-code-assist\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"glaive-code-assist\"\n\nMore Information needed"
] |
7225bb9c9b7352a1b183ed75f0385cef5adcb885 | # Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | shoyimobloqulov/text-to-speech-tts | [
"task_categories:text-classification",
"task_categories:text-generation",
"task_categories:text2text-generation",
"task_categories:translation",
"size_categories:n>1T",
"language:uz",
"license:mit",
"region:us"
] | 2024-02-15T06:53:21+00:00 | {"language": ["uz"], "license": "mit", "size_categories": ["n>1T"], "task_categories": ["text-classification", "text-generation", "text2text-generation", "translation"], "pretty_name": "shoyimobloqulov"} | 2024-02-15T07:00:25+00:00 | [] | [
"uz"
] | TAGS
#task_categories-text-classification #task_categories-text-generation #task_categories-text2text-generation #task_categories-translation #size_categories-n>1T #language-Uzbek #license-mit #region-us
| # Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#task_categories-text-classification #task_categories-text-generation #task_categories-text2text-generation #task_categories-translation #size_categories-n>1T #language-Uzbek #license-mit #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
bc827dd5d3b340c3b8612444fe68f2e91a6c4489 | # Luganda Sci-Math-Bio Translations
This dataset contains Luganda and English translations of biological, mathematical and scientific terms | allandclive/Luganda_Sci-Math-Bio_Translations | [
"task_categories:text2text-generation",
"task_categories:translation",
"size_categories:1K<n<10K",
"language:lg",
"language:en",
"license:cc-by-4.0",
"medical",
"biology",
"math",
"science",
"region:us"
] | 2024-02-15T07:14:24+00:00 | {"language": ["lg", "en"], "license": "cc-by-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["text2text-generation", "translation"], "tags": ["medical", "biology", "math", "science"]} | 2024-02-15T07:27:15+00:00 | [] | [
"lg",
"en"
] | TAGS
#task_categories-text2text-generation #task_categories-translation #size_categories-1K<n<10K #language-Ganda #language-English #license-cc-by-4.0 #medical #biology #math #science #region-us
| # Luganda Sci-Math-Bio Translations
This dataset contains Luganda and English translations of biological, mathematical and scientific terms | [
"# Luganda Sci-Math-Bio Translations\n\nThis dataset contains Luganda and English translations of biologicial, mathematical and scientific terms"
] | [
"TAGS\n#task_categories-text2text-generation #task_categories-translation #size_categories-1K<n<10K #language-Ganda #language-English #license-cc-by-4.0 #medical #biology #math #science #region-us \n",
"# Luganda Sci-Math-Bio Translations\n\nThis dataset contains Luganda and English translations of biologicial, mathematical and scientific terms"
] |
3f4aa1ce7597cb63ceb36e370d77ae590c1f57f9 |
# Dataset Card for Evaluation run of Xenon1/Eclipse-13B-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Xenon1/Eclipse-13B-dpo](https://huggingface.co/Xenon1/Eclipse-13B-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xenon1__Eclipse-13B-dpo",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-15T07:17:58.521396](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Eclipse-13B-dpo/blob/main/results_2024-02-15T07-17-58.521396.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6518306406670916,
"acc_stderr": 0.03203563731649302,
"acc_norm": 0.6518870111709603,
"acc_norm_stderr": 0.03270907133110334,
"mc1": 0.3733170134638923,
"mc1_stderr": 0.01693237055757063,
"mc2": 0.5476200521634482,
"mc2_stderr": 0.015129504751265304
},
"harness|arc:challenge|25": {
"acc": 0.60580204778157,
"acc_stderr": 0.014280522667467325,
"acc_norm": 0.6459044368600683,
"acc_norm_stderr": 0.01397545412275656
},
"harness|hellaswag|10": {
"acc": 0.6506671977693687,
"acc_stderr": 0.00475784902341196,
"acc_norm": 0.8500298745269866,
"acc_norm_stderr": 0.0035631244274585173
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139403,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139403
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02911661760608301,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02911661760608301
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7184873949579832,
"acc_stderr": 0.029213549414372174,
"acc_norm": 0.7184873949579832,
"acc_norm_stderr": 0.029213549414372174
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461766,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461766
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229136,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229136
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993452,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993452
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39553072625698327,
"acc_stderr": 0.016353415410075775,
"acc_norm": 0.39553072625698327,
"acc_norm_stderr": 0.016353415410075775
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179604,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179604
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.02456922360046085,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.02456922360046085
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4621903520208605,
"acc_stderr": 0.012733671880342511,
"acc_norm": 0.4621903520208605,
"acc_norm_stderr": 0.012733671880342511
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.02850145286039656,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.02850145286039656
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.01913994374848704,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.01913994374848704
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3733170134638923,
"mc1_stderr": 0.01693237055757063,
"mc2": 0.5476200521634482,
"mc2_stderr": 0.015129504751265304
},
"harness|winogrande|5": {
"acc": 0.846093133385951,
"acc_stderr": 0.010141944523750059
},
"harness|gsm8k|5": {
"acc": 0.6937073540561031,
"acc_stderr": 0.012696930106562915
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Xenon1__Eclipse-13B-dpo | [
"region:us"
] | 2024-02-15T07:20:14+00:00 | {"pretty_name": "Evaluation run of Xenon1/Eclipse-13B-dpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [Xenon1/Eclipse-13B-dpo](https://huggingface.co/Xenon1/Eclipse-13B-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Xenon1__Eclipse-13B-dpo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T07:17:58.521396](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Eclipse-13B-dpo/blob/main/results_2024-02-15T07-17-58.521396.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6518306406670916,\n \"acc_stderr\": 0.03203563731649302,\n \"acc_norm\": 0.6518870111709603,\n \"acc_norm_stderr\": 0.03270907133110334,\n \"mc1\": 0.3733170134638923,\n \"mc1_stderr\": 0.01693237055757063,\n \"mc2\": 0.5476200521634482,\n \"mc2_stderr\": 0.015129504751265304\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.60580204778157,\n \"acc_stderr\": 0.014280522667467325,\n \"acc_norm\": 0.6459044368600683,\n \"acc_norm_stderr\": 0.01397545412275656\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6506671977693687,\n \"acc_stderr\": 0.00475784902341196,\n \"acc_norm\": 0.8500298745269866,\n \"acc_norm_stderr\": 0.0035631244274585173\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 
0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139403,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139403\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n 
\"acc_stderr\": 0.023946724741563976,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.029213549414372174,\n \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.029213549414372174\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461766,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461766\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229136,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229136\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 0.013625556907993452,\n \"acc_norm\": 
0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993452\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39553072625698327,\n \"acc_stderr\": 0.016353415410075775,\n \"acc_norm\": 0.39553072625698327,\n \"acc_norm_stderr\": 0.016353415410075775\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.025403832978179604,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.025403832978179604\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.02456922360046085,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.02456922360046085\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4621903520208605,\n \"acc_stderr\": 0.012733671880342511,\n \"acc_norm\": 0.4621903520208605,\n \"acc_norm_stderr\": 0.012733671880342511\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.02850145286039656,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.02850145286039656\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.01913994374848704,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.01913994374848704\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484378,\n \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484378\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3733170134638923,\n \"mc1_stderr\": 0.01693237055757063,\n \"mc2\": 0.5476200521634482,\n \"mc2_stderr\": 0.015129504751265304\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.846093133385951,\n \"acc_stderr\": 0.010141944523750059\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6937073540561031,\n \"acc_stderr\": 0.012696930106562915\n }\n}\n```", "repo_url": "https://huggingface.co/Xenon1/Eclipse-13B-dpo", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|arc:challenge|25_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|gsm8k|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hellaswag|10_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T07-17-58.521396.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T07-17-58.521396.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T07-17-58.521396.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T07-17-58.521396.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T07-17-58.521396.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T07-17-58.521396.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["**/details_harness|winogrande|5_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T07-17-58.521396.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_15T07_17_58.521396", "path": ["results_2024-02-15T07-17-58.521396.parquet"]}, {"split": "latest", "path": 
["results_2024-02-15T07-17-58.521396.parquet"]}]}]} | 2024-02-15T07:20:58+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Xenon1/Eclipse-13B-dpo
Dataset automatically created during the evaluation run of model Xenon1/Eclipse-13B-dpo on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
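A minimal sketch, assuming the details repository follows the leaderboard's usual naming pattern (the repository id below is inferred, not stated in this card) and that the configuration name is one of those listed in this card's configs:

```python
from datasets import load_dataset

# Repository id inferred from the leaderboard's naming convention for details repos.
data = load_dataset("open-llm-leaderboard/details_Xenon1__Eclipse-13B-dpo",
	"harness_winogrande_5",
	split="train")
```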
## Latest results
These are the latest results from run 2024-02-15T07:17:58.521396 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Xenon1/Eclipse-13B-dpo\n\n\n\nDataset automatically created during the evaluation run of model Xenon1/Eclipse-13B-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T07:17:58.521396(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Xenon1/Eclipse-13B-dpo\n\n\n\nDataset automatically created during the evaluation run of model Xenon1/Eclipse-13B-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T07:17:58.521396(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
122a0bc6ad8a7bfc2bc647d131905cd672addfc7 |
# Ukrainian News Summarization Dataset v1.1
# Based on [shamotskyi/ukr_pravda_2y](https://huggingface.co/datasets/shamotskyi/ukr_pravda_2y) News Dataset
This dataset contains news articles from the Ukrainian news website pravda.com.ua, summarized using the Gemini Pro model. The dataset is designed to support research in Ukrainian text summarization, news headline generation, and other NLP tasks.
## Dataset Structure
The dataset is structured as a CSV file with the following columns:
* **text:** The full text of the news article.
* **summary:** The summary of the news article, generated with the Gemini Pro model via the Gemini API.
## Usage Examples
**Fine-tuning Summarization Models:**
```python
from datasets import load_dataset

dataset = load_dataset("d0p3/ukr-pravda-news-summary-v1.1")
# Fine-tune your summarization model on the 'text' and 'summary' columns
```
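A minimal preprocessing sketch for sequence-to-sequence fine-tuning, assuming a multilingual checkpoint such as `google/mt5-small`; the checkpoint name and the length limits are illustrative, not part of the dataset:

```python
from datasets import load_dataset
from transformers import AutoTokenizer

dataset = load_dataset("d0p3/ukr-pravda-news-summary-v1.1")
tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")  # illustrative checkpoint

def preprocess(batch):
    # The article text is the model input; the Gemini Pro summary is the target.
    model_inputs = tokenizer(batch["text"], max_length=1024, truncation=True)
    labels = tokenizer(text_target=batch["summary"], max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset.map(preprocess, batched=True, remove_columns=dataset["train"].column_names)
```

The tokenized dataset can then be fed to `Seq2SeqTrainer` (with `DataCollatorForSeq2Seq`) or any equivalent training loop.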
**Evaluating Summarization Quality:**
```python
from rouge import Rouge  # pip install rouge

rouge = Rouge()
# `model_generated_summaries` is a list of model outputs, aligned with the references below.
references = dataset["train"]["summary"]  # "train" is the default split created by load_dataset
scores = rouge.get_scores(model_generated_summaries, references, avg=True)
```
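One way to produce the `model_generated_summaries` list used above (a sketch only; the checkpoint is a placeholder and should be replaced with a model actually fine-tuned for Ukrainian summarization):

```python
from transformers import pipeline

summarizer = pipeline("summarization", model="google/mt5-small")  # placeholder checkpoint
articles = dataset["train"]["text"]  # slice both articles and references for a quick check
model_generated_summaries = [
    summarizer(article, max_length=128, truncation=True)[0]["summary_text"]
    for article in articles
]
```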
## Creation Process
1. **Web Scraping:** [shamotskyi/ukr_pravda_2y](https://huggingface.co/datasets/shamotskyi/ukr_pravda_2y) dataset was used as a base.
2. **Summarization:** Each article's `ukr_text` was summarized using the Gemini Pro model via the Gemini API (see the sketch below).
3. **Dataset Formatting:** The data was compiled into a CSV format.
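A minimal sketch of step 2 (the summarization call), assuming the `google-generativeai` client; the exact prompt used when building the dataset is not documented, so the prompt below is illustrative:

```python
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder key
model = genai.GenerativeModel("gemini-pro")

def summarize(article_text: str) -> str:
    # Illustrative prompt; the authors' original prompt wording is not documented.
    prompt = f"Стисло підсумуй цю новину українською мовою:\n\n{article_text}"
    return model.generate_content(prompt).text
```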
## Licensing
This dataset is released under the [CC BY-NC 4.0](https://creativecommons.org/licenses/by-nc/4.0/) license. The rights to the original pravda.com.ua news articles remain with their respective authors.
## Ethical Considerations
* News article summarization comes with its own ethical concerns. Ensure this dataset is not used to generate misleading or deceptive content.
* Always consider the potential biases and limitations of Gemini Pro as a summarization model.
## Contributors
* [d0p3]
## Expanding the Dataset
We welcome contributions! If you'd like to expand the dataset by adding more articles or summaries from other Ukrainian news sources, please feel free to do so. | d0p3/ukr-pravda-news-summary-v1.1 | [
"task_categories:summarization",
"size_categories:10K<n<100K",
"language:uk",
"license:cc-by-nc-4.0",
"region:us"
] | 2024-02-15T07:29:27+00:00 | {"language": ["uk"], "license": "cc-by-nc-4.0", "size_categories": ["10K<n<100K"], "task_categories": ["summarization"], "pretty_name": "Ukr Pravda News Summarized v1.1"} | 2024-02-15T07:46:01+00:00 | [] | [
"uk"
] | TAGS
#task_categories-summarization #size_categories-10K<n<100K #language-Ukrainian #license-cc-by-nc-4.0 #region-us
|
# Ukrainian News Summarization Dataset v1.1
# Based on shamotskyi/ukr_pravda_2y News Dataset
This dataset contains news articles from the Ukrainian news website URL, summarized using the Gemini Pro model. The dataset is designed to support research in Ukrainian text summarization, news headline generation, and other NLP tasks.
## Dataset Structure
The dataset is structured as a CSV file with the following columns:
* text: The full text of the news article.
* summary: The Gemini Pro generated summary of the news article via Gemini API
## Usage Examples
Fine-tuning Summarization Models:
Evaluating Summarization Quality:
## Creation Process
1. Web Scraping: shamotskyi/ukr_pravda_2y dataset was used as a base.
2. Summarization: Each article's 'ukr_text' was summarized using the Gemini Pro model via Gemini API.
3. Dataset Formatting: The data was compiled into a CSV format.
## Licensing
This dataset is released under the [CC-BY-NC-4.0]. The rights to the original URL news articles remain with their respective authors.
## Ethical Considerations
* News article summarization comes with its own ethical concerns. Ensure this dataset is not used to generate misleading or deceptive content.
* Always consider the potential biases and limitations of Gemini Pro as a summarization model.
## Contributors
* [d0p3]
## Expanding the Dataset
We welcome contributions! If you'd like to expand the dataset by adding more articles or summaries from other Ukrainian news sources! | [
"# Ukrainian News Summarization Dataset v1.1",
"# Based on shamotskyi/ukr_pravda_2y News Dataset\n\nThis dataset contains news articles from the Ukrainian news website URL, summarized using the Gemini Pro model. The dataset is designed to support research in Ukrainian text summarization, news headline generation, and other NLP tasks.",
"## Dataset Structure\n\nThe dataset is structured as a CSV file with the following columns:\n\n* text: The full text of the news article.\n* summary: The Gemini Pro generated summary of the news article via Gemini API",
"## Usage Examples\n\nFine-tuning Summarization Models:\n\n\n\nEvaluating Summarization Quality:",
"## Creation Process\n\n1. Web Scraping: shamotskyi/ukr_pravda_2y dataset was used as a base.\n2. Summarization: Each article's 'ukr_text' was summarized using the Gemini Pro model via Gemini API.\n3. Dataset Formatting: The data was compiled into a CSV format.",
"## Licensing\n\nThis dataset is released under the [CC-BY-NC-4.0]. The rights to the original URL news articles remain with their respective authors.",
"## Ethical Considerations\n\n* News article summarization comes with its own ethical concerns. Ensure this dataset is not used to generate misleading or deceptive content.\n* Always consider the potential biases and limitations of Gemini Pro as a summarization model.",
"## Contributors\n\n* [d0p3]",
"## Expanding the Dataset\n\nWe welcome contributions! If you'd like to expand the dataset by adding more articles or summaries from other Ukrainian news sources!"
] | [
"TAGS\n#task_categories-summarization #size_categories-10K<n<100K #language-Ukrainian #license-cc-by-nc-4.0 #region-us \n",
"# Ukrainian News Summarization Dataset v1.1",
"# Based on shamotskyi/ukr_pravda_2y News Dataset\n\nThis dataset contains news articles from the Ukrainian news website URL, summarized using the Gemini Pro model. The dataset is designed to support research in Ukrainian text summarization, news headline generation, and other NLP tasks.",
"## Dataset Structure\n\nThe dataset is structured as a CSV file with the following columns:\n\n* text: The full text of the news article.\n* summary: The Gemini Pro generated summary of the news article via Gemini API",
"## Usage Examples\n\nFine-tuning Summarization Models:\n\n\n\nEvaluating Summarization Quality:",
"## Creation Process\n\n1. Web Scraping: shamotskyi/ukr_pravda_2y dataset was used as a base.\n2. Summarization: Each article's 'ukr_text' was summarized using the Gemini Pro model via Gemini API.\n3. Dataset Formatting: The data was compiled into a CSV format.",
"## Licensing\n\nThis dataset is released under the [CC-BY-NC-4.0]. The rights to the original URL news articles remain with their respective authors.",
"## Ethical Considerations\n\n* News article summarization comes with its own ethical concerns. Ensure this dataset is not used to generate misleading or deceptive content.\n* Always consider the potential biases and limitations of Gemini Pro as a summarization model.",
"## Contributors\n\n* [d0p3]",
"## Expanding the Dataset\n\nWe welcome contributions! If you'd like to expand the dataset by adding more articles or summaries from other Ukrainian news sources!"
] |
af832e4d7a986d171ec7a3dcc7c4cf303e8d4716 |
# Dataset Card for Evaluation run of aloobun/Reyna-Mini-1.8B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [aloobun/Reyna-Mini-1.8B-v0.1](https://huggingface.co/aloobun/Reyna-Mini-1.8B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aloobun__Reyna-Mini-1.8B-v0.1",
"harness_winogrande_5",
split="train")
```
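To get the aggregated metrics for the run (the "results" configuration mentioned above), a sketch that reuses the import above and assumes the same "latest" split naming used by the per-task configurations:

```python
results = load_dataset("open-llm-leaderboard/details_aloobun__Reyna-Mini-1.8B-v0.1",
	"results",
	split="latest")
```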
## Latest results
These are the [latest results from run 2024-02-15T07:29:36.560907](https://huggingface.co/datasets/open-llm-leaderboard/details_aloobun__Reyna-Mini-1.8B-v0.1/blob/main/results_2024-02-15T07-29-36.560907.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.44766652106081417,
"acc_stderr": 0.03438060993883449,
"acc_norm": 0.4545350911182196,
"acc_norm_stderr": 0.03520914160548039,
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253595,
"mc2": 0.4140207828143034,
"mc2_stderr": 0.014035709599911956
},
"harness|arc:challenge|25": {
"acc": 0.33361774744027306,
"acc_stderr": 0.013778687054176546,
"acc_norm": 0.35238907849829354,
"acc_norm_stderr": 0.013960142600598675
},
"harness|hellaswag|10": {
"acc": 0.44991037641904,
"acc_stderr": 0.004964679845918436,
"acc_norm": 0.6041625174268074,
"acc_norm_stderr": 0.004880303863138508
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3925925925925926,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.3925925925925926,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4342105263157895,
"acc_stderr": 0.040335656678483184,
"acc_norm": 0.4342105263157895,
"acc_norm_stderr": 0.040335656678483184
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5018867924528302,
"acc_stderr": 0.030772653642075664,
"acc_norm": 0.5018867924528302,
"acc_norm_stderr": 0.030772653642075664
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.04140685639111502,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.04140685639111502
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101737,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101737
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159393,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159393
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.024870815251057096,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.024870815251057096
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03718489006818115,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03718489006818115
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4774193548387097,
"acc_stderr": 0.02841498501970786,
"acc_norm": 0.4774193548387097,
"acc_norm_stderr": 0.02841498501970786
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.037937131711656344,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.037937131711656344
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5808080808080808,
"acc_stderr": 0.03515520728670417,
"acc_norm": 0.5808080808080808,
"acc_norm_stderr": 0.03515520728670417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5440414507772021,
"acc_stderr": 0.035944137112724366,
"acc_norm": 0.5440414507772021,
"acc_norm_stderr": 0.035944137112724366
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.358974358974359,
"acc_stderr": 0.024321738484602354,
"acc_norm": 0.358974358974359,
"acc_norm_stderr": 0.024321738484602354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066468,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066468
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.41596638655462187,
"acc_stderr": 0.03201650100739615,
"acc_norm": 0.41596638655462187,
"acc_norm_stderr": 0.03201650100739615
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008936,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5559633027522936,
"acc_stderr": 0.021302621211654518,
"acc_norm": 0.5559633027522936,
"acc_norm_stderr": 0.021302621211654518
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.03038805130167812,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.03038805130167812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.44607843137254904,
"acc_stderr": 0.03488845451304974,
"acc_norm": 0.44607843137254904,
"acc_norm_stderr": 0.03488845451304974
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5949367088607594,
"acc_stderr": 0.03195514741370671,
"acc_norm": 0.5949367088607594,
"acc_norm_stderr": 0.03195514741370671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5201793721973094,
"acc_stderr": 0.033530461674123,
"acc_norm": 0.5201793721973094,
"acc_norm_stderr": 0.033530461674123
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.043564472026650695,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.043564472026650695
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5,
"acc_stderr": 0.04833682445228318,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04833682445228318
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.44785276073619634,
"acc_stderr": 0.039069474794566024,
"acc_norm": 0.44785276073619634,
"acc_norm_stderr": 0.039069474794566024
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7350427350427351,
"acc_stderr": 0.028911208802749472,
"acc_norm": 0.7350427350427351,
"acc_norm_stderr": 0.028911208802749472
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6015325670498084,
"acc_stderr": 0.01750743860277741,
"acc_norm": 0.6015325670498084,
"acc_norm_stderr": 0.01750743860277741
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5289017341040463,
"acc_stderr": 0.026874085883518348,
"acc_norm": 0.5289017341040463,
"acc_norm_stderr": 0.026874085883518348
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.264804469273743,
"acc_stderr": 0.014756906483260659,
"acc_norm": 0.264804469273743,
"acc_norm_stderr": 0.014756906483260659
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.0282135041778241,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.0282135041778241
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.43729903536977494,
"acc_stderr": 0.028173917761762885,
"acc_norm": 0.43729903536977494,
"acc_norm_stderr": 0.028173917761762885
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4567901234567901,
"acc_stderr": 0.02771666165019404,
"acc_norm": 0.4567901234567901,
"acc_norm_stderr": 0.02771666165019404
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.02872386385328128,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.02872386385328128
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35528031290743156,
"acc_stderr": 0.01222362336404404,
"acc_norm": 0.35528031290743156,
"acc_norm_stderr": 0.01222362336404404
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3272058823529412,
"acc_stderr": 0.02850145286039655,
"acc_norm": 0.3272058823529412,
"acc_norm_stderr": 0.02850145286039655
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.43300653594771243,
"acc_stderr": 0.020045442473324227,
"acc_norm": 0.43300653594771243,
"acc_norm_stderr": 0.020045442473324227
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4326530612244898,
"acc_stderr": 0.03171752824062664,
"acc_norm": 0.4326530612244898,
"acc_norm_stderr": 0.03171752824062664
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5870646766169154,
"acc_stderr": 0.03481520803367348,
"acc_norm": 0.5870646766169154,
"acc_norm_stderr": 0.03481520803367348
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.03809973084540218,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.03809973084540218
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5380116959064327,
"acc_stderr": 0.03823727092882307,
"acc_norm": 0.5380116959064327,
"acc_norm_stderr": 0.03823727092882307
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253595,
"mc2": 0.4140207828143034,
"mc2_stderr": 0.014035709599911956
},
"harness|winogrande|5": {
"acc": 0.6085240726124704,
"acc_stderr": 0.013717487071290856
},
"harness|gsm8k|5": {
"acc": 0.05458680818802123,
"acc_stderr": 0.006257444037912527
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_aloobun__Reyna-Mini-1.8B-v0.1 | [
"region:us"
] | 2024-02-15T07:31:44+00:00 | {"pretty_name": "Evaluation run of aloobun/Reyna-Mini-1.8B-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [aloobun/Reyna-Mini-1.8B-v0.1](https://huggingface.co/aloobun/Reyna-Mini-1.8B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aloobun__Reyna-Mini-1.8B-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T07:29:36.560907](https://huggingface.co/datasets/open-llm-leaderboard/details_aloobun__Reyna-Mini-1.8B-v0.1/blob/main/results_2024-02-15T07-29-36.560907.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.44766652106081417,\n \"acc_stderr\": 0.03438060993883449,\n \"acc_norm\": 0.4545350911182196,\n \"acc_norm_stderr\": 0.03520914160548039,\n \"mc1\": 0.26560587515299877,\n \"mc1_stderr\": 0.015461027627253595,\n \"mc2\": 0.4140207828143034,\n \"mc2_stderr\": 0.014035709599911956\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.33361774744027306,\n \"acc_stderr\": 0.013778687054176546,\n \"acc_norm\": 0.35238907849829354,\n \"acc_norm_stderr\": 0.013960142600598675\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.44991037641904,\n \"acc_stderr\": 0.004964679845918436,\n \"acc_norm\": 0.6041625174268074,\n \"acc_norm_stderr\": 0.004880303863138508\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3925925925925926,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.3925925925925926,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.040335656678483184,\n \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.040335656678483184\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5018867924528302,\n \"acc_stderr\": 0.030772653642075664,\n \"acc_norm\": 0.5018867924528302,\n \"acc_norm_stderr\": 0.030772653642075664\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4305555555555556,\n \"acc_stderr\": 0.04140685639111502,\n \"acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.04140685639111502\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n 
\"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3988439306358382,\n \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.3988439306358382,\n \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101737,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101737\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.04266339443159393,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.04266339443159393\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.024870815251057096,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.024870815251057096\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03718489006818115,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03718489006818115\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4774193548387097,\n \"acc_stderr\": 0.02841498501970786,\n \"acc_norm\": 0.4774193548387097,\n \"acc_norm_stderr\": 0.02841498501970786\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.0338640574606209,\n \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.0338640574606209\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.037937131711656344,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.037937131711656344\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5808080808080808,\n \"acc_stderr\": 0.03515520728670417,\n \"acc_norm\": 0.5808080808080808,\n \"acc_norm_stderr\": 0.03515520728670417\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.5440414507772021,\n \"acc_stderr\": 0.035944137112724366,\n \"acc_norm\": 0.5440414507772021,\n \"acc_norm_stderr\": 0.035944137112724366\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.358974358974359,\n \"acc_stderr\": 
0.024321738484602354,\n \"acc_norm\": 0.358974358974359,\n \"acc_norm_stderr\": 0.024321738484602354\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066468,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066468\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.41596638655462187,\n \"acc_stderr\": 0.03201650100739615,\n \"acc_norm\": 0.41596638655462187,\n \"acc_norm_stderr\": 0.03201650100739615\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008936,\n \"acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008936\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5559633027522936,\n \"acc_stderr\": 0.021302621211654518,\n \"acc_norm\": 0.5559633027522936,\n \"acc_norm_stderr\": 0.021302621211654518\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.27314814814814814,\n \"acc_stderr\": 0.03038805130167812,\n \"acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.03038805130167812\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.44607843137254904,\n \"acc_stderr\": 0.03488845451304974,\n \"acc_norm\": 0.44607843137254904,\n \"acc_norm_stderr\": 0.03488845451304974\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5949367088607594,\n \"acc_stderr\": 0.03195514741370671,\n \"acc_norm\": 0.5949367088607594,\n \"acc_norm_stderr\": 0.03195514741370671\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5201793721973094,\n \"acc_stderr\": 0.033530461674123,\n \"acc_norm\": 0.5201793721973094,\n \"acc_norm_stderr\": 0.033530461674123\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.043564472026650695,\n \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.043564472026650695\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04833682445228318,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04833682445228318\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.44785276073619634,\n \"acc_stderr\": 0.039069474794566024,\n \"acc_norm\": 0.44785276073619634,\n \"acc_norm_stderr\": 0.039069474794566024\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7350427350427351,\n \"acc_stderr\": 0.028911208802749472,\n \"acc_norm\": 0.7350427350427351,\n \"acc_norm_stderr\": 0.028911208802749472\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6015325670498084,\n \"acc_stderr\": 0.01750743860277741,\n \"acc_norm\": 0.6015325670498084,\n \"acc_norm_stderr\": 0.01750743860277741\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5289017341040463,\n \"acc_stderr\": 0.026874085883518348,\n \"acc_norm\": 0.5289017341040463,\n \"acc_norm_stderr\": 0.026874085883518348\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n \"acc_stderr\": 0.014756906483260659,\n \"acc_norm\": 0.264804469273743,\n \"acc_norm_stderr\": 0.014756906483260659\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.0282135041778241,\n \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.0282135041778241\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.43729903536977494,\n \"acc_stderr\": 0.028173917761762885,\n \"acc_norm\": 0.43729903536977494,\n \"acc_norm_stderr\": 0.028173917761762885\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4567901234567901,\n \"acc_stderr\": 0.02771666165019404,\n \"acc_norm\": 0.4567901234567901,\n \"acc_norm_stderr\": 0.02771666165019404\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.36524822695035464,\n \"acc_stderr\": 0.02872386385328128,\n \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.02872386385328128\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35528031290743156,\n \"acc_stderr\": 0.01222362336404404,\n \"acc_norm\": 0.35528031290743156,\n \"acc_norm_stderr\": 0.01222362336404404\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3272058823529412,\n \"acc_stderr\": 0.02850145286039655,\n \"acc_norm\": 0.3272058823529412,\n \"acc_norm_stderr\": 0.02850145286039655\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.43300653594771243,\n \"acc_stderr\": 0.020045442473324227,\n \"acc_norm\": 0.43300653594771243,\n \"acc_norm_stderr\": 0.020045442473324227\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4326530612244898,\n \"acc_stderr\": 0.03171752824062664,\n \"acc_norm\": 0.4326530612244898,\n \"acc_norm_stderr\": 0.03171752824062664\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5870646766169154,\n \"acc_stderr\": 0.03481520803367348,\n \"acc_norm\": 0.5870646766169154,\n \"acc_norm_stderr\": 0.03481520803367348\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n \"acc_stderr\": 0.03809973084540218,\n \"acc_norm\": 0.39759036144578314,\n \"acc_norm_stderr\": 0.03809973084540218\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5380116959064327,\n \"acc_stderr\": 0.03823727092882307,\n \"acc_norm\": 0.5380116959064327,\n \"acc_norm_stderr\": 0.03823727092882307\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26560587515299877,\n \"mc1_stderr\": 0.015461027627253595,\n \"mc2\": 0.4140207828143034,\n \"mc2_stderr\": 0.014035709599911956\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6085240726124704,\n \"acc_stderr\": 0.013717487071290856\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05458680818802123,\n \"acc_stderr\": 0.006257444037912527\n }\n}\n```", "repo_url": "https://huggingface.co/aloobun/Reyna-Mini-1.8B-v0.1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|arc:challenge|25_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|gsm8k|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hellaswag|10_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T07-29-36.560907.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T07-29-36.560907.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T07-29-36.560907.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T07-29-36.560907.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T07-29-36.560907.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T07-29-36.560907.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["**/details_harness|winogrande|5_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T07-29-36.560907.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_15T07_29_36.560907", "path": ["results_2024-02-15T07-29-36.560907.parquet"]}, {"split": "latest", "path": 
["results_2024-02-15T07-29-36.560907.parquet"]}]}]} | 2024-02-15T07:32:05+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of aloobun/Reyna-Mini-1.8B-v0.1
Dataset automatically created during the evaluation run of model aloobun/Reyna-Mini-1.8B-v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
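A minimal sketch of that loading call (the repository and configuration names are taken from this card's metadata; any of the other 63 configurations listed there can be substituted):

```python
from datasets import load_dataset

# Load one task configuration of the evaluation details for this model.
# "harness_winogrande_5" is one of the configurations listed in the metadata.
data = load_dataset(
    "open-llm-leaderboard/details_aloobun__Reyna-Mini-1.8B-v0.1",
    "harness_winogrande_5",
    split="train",  # per this card, the "train" split points to the latest results
)
```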
## Latest results
These are the latest results from run 2024-02-15T07:29:36.560907 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of aloobun/Reyna-Mini-1.8B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model aloobun/Reyna-Mini-1.8B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T07:29:36.560907(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of aloobun/Reyna-Mini-1.8B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model aloobun/Reyna-Mini-1.8B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T07:29:36.560907(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
dcde10aeb209010dbde97eee637354a1020b7bda | # Dataset Card for "wsd_myriade_avocat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | gguichard/wsd_myriade_avocat | [
"region:us"
] | 2024-02-15T07:55:35+00:00 | {"dataset_info": {"features": [{"name": "tokens", "sequence": "string"}, {"name": "wn_sens", "sequence": "int64"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "labels", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 22048, "num_examples": 42}], "download_size": 8176, "dataset_size": 22048}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-15T07:55:37+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "wsd_myriade_avocat"
More Information needed | [
"# Dataset Card for \"wsd_myriade_avocat\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"wsd_myriade_avocat\"\n\nMore Information needed"
] |
8697590d5d45d5e8ecdb7bce94ef8554952ef103 |
# Dataset Card for Evaluation run of tyson0420/mixtral_stack_llama
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [tyson0420/mixtral_stack_llama](https://huggingface.co/tyson0420/mixtral_stack_llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tyson0420__mixtral_stack_llama",
"harness_winogrande_5",
split="train")
```
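
As a usage note, the aggregated "results" configuration mentioned above can be loaded the same way. This is a sketch under the assumption that it follows the same split naming as in the other evaluation-details repositories in this dump (a "latest" split alongside timestamped ones):

```python
from datasets import load_dataset

# Aggregated metrics for the whole run are stored in the "results" configuration.
results = load_dataset(
    "open-llm-leaderboard/details_tyson0420__mixtral_stack_llama",
    "results",
    split="latest",  # assumed split name; the timestamped split holds the same data
)
print(results[0])  # latest aggregated scores for this model
```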
## Latest results
These are the [latest results from run 2024-02-15T08:21:27.970055](https://huggingface.co/datasets/open-llm-leaderboard/details_tyson0420__mixtral_stack_llama/blob/main/results_2024-02-15T08-21-27.970055.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.28193427391738846,
"acc_stderr": 0.03169270439313508,
"acc_norm": 0.2845747380485041,
"acc_norm_stderr": 0.03252371590260296,
"mc1": 0.20563035495716034,
"mc1_stderr": 0.014148482219460972,
"mc2": 0.38221457050909724,
"mc2_stderr": 0.015352799377174492
},
"harness|arc:challenge|25": {
"acc": 0.302901023890785,
"acc_stderr": 0.013428241573185347,
"acc_norm": 0.3455631399317406,
"acc_norm_stderr": 0.013896938461145682
},
"harness|hellaswag|10": {
"acc": 0.37860983867755427,
"acc_stderr": 0.0048404936031662075,
"acc_norm": 0.5023899621589325,
"acc_norm_stderr": 0.004989724408664516
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2565789473684211,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.2565789473684211,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3018867924528302,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.3018867924528302,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080342,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080342
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483099,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483099
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03708284662416544,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03708284662416544
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.03047297336338004,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.03047297336338004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748142,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748142
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.03752833958003337,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.03752833958003337
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525218,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525218
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.035122074123020514,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.035122074123020514
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2967741935483871,
"acc_stderr": 0.025988500792411894,
"acc_norm": 0.2967741935483871,
"acc_norm_stderr": 0.025988500792411894
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.031270907132977,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.031270907132977
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3383838383838384,
"acc_stderr": 0.03371124142626304,
"acc_norm": 0.3383838383838384,
"acc_norm_stderr": 0.03371124142626304
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.34196891191709844,
"acc_stderr": 0.03423465100104282,
"acc_norm": 0.34196891191709844,
"acc_norm_stderr": 0.03423465100104282
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.31025641025641026,
"acc_stderr": 0.02345467488940429,
"acc_norm": 0.31025641025641026,
"acc_norm_stderr": 0.02345467488940429
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895992,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895992
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.030388353551886845,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.030388353551886845
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3559633027522936,
"acc_stderr": 0.020528559278244214,
"acc_norm": 0.3559633027522936,
"acc_norm_stderr": 0.020528559278244214
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.033851779760448106,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.033851779760448106
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.029571601065753374,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.029571601065753374
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.36771300448430494,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.36771300448430494,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728745,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728745
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.040598672469526864,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.040598672469526864
},
"harness|hendrycksTest-management|5": {
"acc": 0.33980582524271846,
"acc_stderr": 0.04689765937278135,
"acc_norm": 0.33980582524271846,
"acc_norm_stderr": 0.04689765937278135
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.25213675213675213,
"acc_stderr": 0.02844796547623102,
"acc_norm": 0.25213675213675213,
"acc_norm_stderr": 0.02844796547623102
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28991060025542786,
"acc_stderr": 0.016225017944770957,
"acc_norm": 0.28991060025542786,
"acc_norm_stderr": 0.016225017944770957
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.29190751445086704,
"acc_stderr": 0.02447699407624732,
"acc_norm": 0.29190751445086704,
"acc_norm_stderr": 0.02447699407624732
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808864,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808864
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.32679738562091504,
"acc_stderr": 0.02685729466328141,
"acc_norm": 0.32679738562091504,
"acc_norm_stderr": 0.02685729466328141
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.29260450160771706,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.29260450160771706,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.025630824975621344,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.025630824975621344
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.0252578613594324,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.0252578613594324
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24967405475880053,
"acc_stderr": 0.01105453837783233,
"acc_norm": 0.24967405475880053,
"acc_norm_stderr": 0.01105453837783233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.30514705882352944,
"acc_stderr": 0.027971541370170598,
"acc_norm": 0.30514705882352944,
"acc_norm_stderr": 0.027971541370170598
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.272875816993464,
"acc_stderr": 0.01802047414839358,
"acc_norm": 0.272875816993464,
"acc_norm_stderr": 0.01802047414839358
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.04309118709946459,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.04309118709946459
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.23673469387755103,
"acc_stderr": 0.02721283588407316,
"acc_norm": 0.23673469387755103,
"acc_norm_stderr": 0.02721283588407316
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.031157150869355558,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.031157150869355558
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.0362933532994786,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.0362933532994786
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.20563035495716034,
"mc1_stderr": 0.014148482219460972,
"mc2": 0.38221457050909724,
"mc2_stderr": 0.015352799377174492
},
"harness|winogrande|5": {
"acc": 0.5730071033938438,
"acc_stderr": 0.01390187807257506
},
"harness|gsm8k|5": {
"acc": 0.001516300227445034,
"acc_stderr": 0.0010717793485492606
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_tyson0420__mixtral_stack_llama | [
"region:us"
] | 2024-02-15T08:23:52+00:00 | {"pretty_name": "Evaluation run of tyson0420/mixtral_stack_llama", "dataset_summary": "Dataset automatically created during the evaluation run of model [tyson0420/mixtral_stack_llama](https://huggingface.co/tyson0420/mixtral_stack_llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tyson0420__mixtral_stack_llama\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T08:21:27.970055](https://huggingface.co/datasets/open-llm-leaderboard/details_tyson0420__mixtral_stack_llama/blob/main/results_2024-02-15T08-21-27.970055.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.28193427391738846,\n \"acc_stderr\": 0.03169270439313508,\n \"acc_norm\": 0.2845747380485041,\n \"acc_norm_stderr\": 0.03252371590260296,\n \"mc1\": 0.20563035495716034,\n \"mc1_stderr\": 0.014148482219460972,\n \"mc2\": 0.38221457050909724,\n \"mc2_stderr\": 0.015352799377174492\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.302901023890785,\n \"acc_stderr\": 0.013428241573185347,\n \"acc_norm\": 0.3455631399317406,\n \"acc_norm_stderr\": 0.013896938461145682\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.37860983867755427,\n \"acc_stderr\": 0.0048404936031662075,\n \"acc_norm\": 0.5023899621589325,\n \"acc_norm_stderr\": 0.004989724408664516\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2565789473684211,\n \"acc_stderr\": 0.0355418036802569,\n \"acc_norm\": 0.2565789473684211,\n \"acc_norm_stderr\": 0.0355418036802569\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.3018867924528302,\n \"acc_stderr\": 0.028254200344438662,\n \"acc_norm\": 0.3018867924528302,\n \"acc_norm_stderr\": 0.028254200344438662\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n \"acc_stderr\": 0.03586879280080342,\n \"acc_norm\": 0.24305555555555555,\n \"acc_norm_stderr\": 0.03586879280080342\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.24,\n 
\"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.03242414757483099,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.03242414757483099\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03708284662416544,\n \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03708284662416544\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.03047297336338004,\n \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.03047297336338004\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.04049339297748142,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.04049339297748142\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.03752833958003337,\n \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.03752833958003337\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525218,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525218\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n \"acc_stderr\": 0.035122074123020514,\n \"acc_norm\": 0.19047619047619047,\n \"acc_norm_stderr\": 0.035122074123020514\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2967741935483871,\n \"acc_stderr\": 0.025988500792411894,\n \"acc_norm\": 0.2967741935483871,\n \"acc_norm_stderr\": 0.025988500792411894\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.031270907132977,\n \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132977\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3383838383838384,\n \"acc_stderr\": 0.03371124142626304,\n \"acc_norm\": 0.3383838383838384,\n \"acc_norm_stderr\": 0.03371124142626304\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.34196891191709844,\n \"acc_stderr\": 0.03423465100104282,\n \"acc_norm\": 0.34196891191709844,\n \"acc_norm_stderr\": 0.03423465100104282\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.31025641025641026,\n \"acc_stderr\": 0.02345467488940429,\n \"acc_norm\": 0.31025641025641026,\n \"acc_norm_stderr\": 0.02345467488940429\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895992,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895992\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.030388353551886845,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.030388353551886845\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3559633027522936,\n \"acc_stderr\": 0.020528559278244214,\n \"acc_norm\": 0.3559633027522936,\n \"acc_norm_stderr\": 0.020528559278244214\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4398148148148148,\n \"acc_stderr\": 0.033851779760448106,\n \"acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.033851779760448106\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2911392405063291,\n \"acc_stderr\": 0.029571601065753374,\n \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.029571601065753374\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.36771300448430494,\n \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.36771300448430494,\n \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728745,\n \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728745\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n \"acc_stderr\": 0.040598672469526864,\n \"acc_norm\": 0.24107142857142858,\n \"acc_norm_stderr\": 0.040598672469526864\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.33980582524271846,\n \"acc_stderr\": 0.04689765937278135,\n \"acc_norm\": 0.33980582524271846,\n \"acc_norm_stderr\": 0.04689765937278135\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.25213675213675213,\n \"acc_stderr\": 0.02844796547623102,\n \"acc_norm\": 0.25213675213675213,\n \"acc_norm_stderr\": 0.02844796547623102\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.28991060025542786,\n \"acc_stderr\": 0.016225017944770957,\n \"acc_norm\": 0.28991060025542786,\n \"acc_norm_stderr\": 0.016225017944770957\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.29190751445086704,\n \"acc_stderr\": 0.02447699407624732,\n \"acc_norm\": 0.29190751445086704,\n \"acc_norm_stderr\": 0.02447699407624732\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808864,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808864\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.32679738562091504,\n \"acc_stderr\": 0.02685729466328141,\n \"acc_norm\": 0.32679738562091504,\n \"acc_norm_stderr\": 0.02685729466328141\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.29260450160771706,\n \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.29260450160771706,\n \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.025630824975621344,\n \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.025630824975621344\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.0252578613594324,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.0252578613594324\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24967405475880053,\n \"acc_stderr\": 0.01105453837783233,\n \"acc_norm\": 0.24967405475880053,\n \"acc_norm_stderr\": 0.01105453837783233\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.30514705882352944,\n \"acc_stderr\": 0.027971541370170598,\n \"acc_norm\": 0.30514705882352944,\n \"acc_norm_stderr\": 0.027971541370170598\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.272875816993464,\n \"acc_stderr\": 0.01802047414839358,\n \"acc_norm\": 0.272875816993464,\n \"acc_norm_stderr\": 0.01802047414839358\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n \"acc_stderr\": 0.04309118709946459,\n \"acc_norm\": 0.2818181818181818,\n \"acc_norm_stderr\": 0.04309118709946459\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.23673469387755103,\n \"acc_stderr\": 0.02721283588407316,\n \"acc_norm\": 0.23673469387755103,\n \"acc_norm_stderr\": 0.02721283588407316\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n \"acc_stderr\": 0.031157150869355558,\n \"acc_norm\": 0.263681592039801,\n \"acc_norm_stderr\": 0.031157150869355558\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03126781714663179,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03126781714663179\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.20563035495716034,\n \"mc1_stderr\": 0.014148482219460972,\n \"mc2\": 0.38221457050909724,\n \"mc2_stderr\": 0.015352799377174492\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5730071033938438,\n \"acc_stderr\": 0.01390187807257506\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.001516300227445034,\n \"acc_stderr\": 
0.0010717793485492606\n }\n}\n```", "repo_url": "https://huggingface.co/tyson0420/mixtral_stack_llama", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|arc:challenge|25_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|gsm8k|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hellaswag|10_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T08-21-27.970055.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T08-21-27.970055.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T08-21-27.970055.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T08-21-27.970055.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T08-21-27.970055.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T08_21_27.970055", "path": ["**/details_harness|winogrande|5_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T08-21-27.970055.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_15T08_21_27.970055", "path": ["results_2024-02-15T08-21-27.970055.parquet"]}, {"split": "latest", "path": ["results_2024-02-15T08-21-27.970055.parquet"]}]}]} | 2024-02-15T08:24:14+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of tyson0420/mixtral_stack_llama
Dataset automatically created during the evaluation run of model tyson0420/mixtral_stack_llama on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
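A minimal sketch, assuming the details for this run live in the standard `open-llm-leaderboard/details_<org>__<model>` repository (the exact repo name below is inferred from that convention, not confirmed here):

```python
from datasets import load_dataset

# Hypothetical repo name, following the usual Open LLM Leaderboard naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_tyson0420__mixtral_stack_llama",
    "harness_winogrande_5",
    split="train",
)
```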
## Latest results
These are the latest results from run 2024-02-15T08:21:27.970055 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of tyson0420/mixtral_stack_llama\n\n\n\nDataset automatically created during the evaluation run of model tyson0420/mixtral_stack_llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T08:21:27.970055(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of tyson0420/mixtral_stack_llama\n\n\n\nDataset automatically created during the evaluation run of model tyson0420/mixtral_stack_llama on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T08:21:27.970055(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
8fd620cd1919bf0afe8936a6d5355d8168b1f7d3 |
This is a mixed dataset between Finance domain QA and General QA with the ratio 1:1.
- [Finance dataset](https://huggingface.co/datasets/jan-hq/finance_alpaca_binarized)
- [General dataset](https://huggingface.co/datasets/jan-hq/openhermes-2.5_binarized) | jan-hq/finance_mixed_50_binarized | [
"region:us"
] | 2024-02-15T08:40:50+00:00 | {"dataset_info": {"features": [{"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 190587540.62714094, "num_examples": 125117}, {"name": "test", "num_bytes": 162744958, "num_examples": 107048}], "download_size": 158985767, "dataset_size": 353332498.62714094}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-15T08:45:28+00:00 | [] | [] | TAGS
#region-us
|
This is a mixed dataset between Finance domain QA and General QA with the ratio 1:1.
- Finance dataset
- General dataset | [] | [
"TAGS\n#region-us \n"
] |
f56463e0be5188d087a26e0141d9eac85d288ae1 | # Dataset Card for Alpaca-Cleaned-bn
<!-- Provide a quick summary of the dataset. -->
This is a cleaned, Bengali-translated version of the original Alpaca Dataset released by Stanford.
## Uses
```python
import datasets
dataset = datasets.load_dataset("abrarfahim/alpaca-cleaned-bn")
print(dataset["train"][0])  # the dataset ships a single "train" split
```
## Dataset Structure
```
{'system_prompt': 'You are a virtual assistant, deliver a comprehensive response.',
'qas_id': 'YY9S5K',
'question_text': '"সন্দেহ" শব্দের সঠিক প্রতিশব্দ নির্বাচন করুন।',
'orig_answer_texts': '"সন্দেহ" শব্দের প্রতিশব্দের মধ্যে সন্দেহ, সন্দেহ, অবিশ্বাস, অবিশ্বাস, অসম্মান এবং প্রশ্ন অন্তর্ভুক্ত থাকতে পারে। কিছু প্রসঙ্গ সবচেয়ে উপযুক্ত প্রতিশব্দ চয়ন করতে সহায়ক হবে।'}
``` | abrarfahim/alpaca-cleaned-bn | [
"task_categories:question-answering",
"task_categories:text-generation",
"size_categories:10K<n<100K",
"language:bn",
"license:apache-2.0",
"instruction-finetuning",
"bengali",
"bangla",
"region:us"
] | 2024-02-15T08:44:19+00:00 | {"language": ["bn"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["question-answering", "text-generation"], "pretty_name": "Alpaca Cleaned Bengali", "dataset_info": {"features": [{"name": "system_prompt", "dtype": "string"}, {"name": "qas_id", "dtype": "string"}, {"name": "question_text", "dtype": "string"}, {"name": "orig_answer_texts", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 80016291, "num_examples": 45622}], "download_size": 29404589, "dataset_size": 80016291}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["instruction-finetuning", "bengali", "bangla"]} | 2024-02-15T09:36:55+00:00 | [] | [
"bn"
] | TAGS
#task_categories-question-answering #task_categories-text-generation #size_categories-10K<n<100K #language-Bengali #license-apache-2.0 #instruction-finetuning #bengali #bangla #region-us
| # Dataset Card for Alpaca-Cleaned-bn
This is a cleaned, Bengali-translated version of the original Alpaca Dataset released by Stanford.
## Uses
## Dataset Structure
| [
"# Dataset Card for Alpaca-Cleaned-bn\n\n\n\nThis is a cleaned bengali translated version of the original Alpaca Dataset released by Stanford.",
"## Uses",
"## Dataset Structure"
] | [
"TAGS\n#task_categories-question-answering #task_categories-text-generation #size_categories-10K<n<100K #language-Bengali #license-apache-2.0 #instruction-finetuning #bengali #bangla #region-us \n",
"# Dataset Card for Alpaca-Cleaned-bn\n\n\n\nThis is a cleaned bengali translated version of the original Alpaca Dataset released by Stanford.",
"## Uses",
"## Dataset Structure"
] |
005c764862bdb684345730ab677ba45d2e2ec88e | This is a mixed dataset between Finance domain QA and General QA with the ratio 3:7 (30% Finance : 70% General).
- [Finance dataset](https://huggingface.co/datasets/jan-hq/finance_alpaca_binarized)
- [General dataset](https://huggingface.co/datasets/jan-hq/openhermes-2.5_binarized) | jan-hq/finance_mixed_70_binarized | [
"region:us"
] | 2024-02-15T08:48:07+00:00 | {"dataset_info": {"features": [{"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 314164710.963046, "num_examples": 206243}, {"name": "test", "num_bytes": 162744958, "num_examples": 107048}], "download_size": 226199351, "dataset_size": 476909668.963046}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-15T08:50:24+00:00 | [] | [] | TAGS
#region-us
| This is a mixed dataset between Finance domain QA and General QA with the ratio 3:7 (30% Finance : 70% General).
- Finance dataset
- General dataset | [] | [
"TAGS\n#region-us \n"
] |
69cb07f64e4c54be0c5f77bdc679e55a9bead440 | # Sources
* [Binary classification for DWT-DCT-SVD watermarks](https://www.kaggle.com/code/thonypythony/binary-classification-for-dctandsvd-watermarks)
* [Blind watermark based on DWT-DCT-SVD](https://github.com/unton3ton/mijn_blind_watermark) | thonypythony/Binary_classification_for_DCTandSVD_watermarks | [
"region:us"
] | 2024-02-15T08:58:26+00:00 | {} | 2024-02-15T09:08:10+00:00 | [] | [] | TAGS
#region-us
| # Sources
* Binary classification for DWT-DCT-SVD watermarks
* Blind watermark based on DWT-DCT-SVD | [
"# Sources\n\n* Binary classification for DWT-DCT-SVD watermarks\n* Blind watermark based on DWT-DCT-SVD"
] | [
"TAGS\n#region-us \n",
"# Sources\n\n* Binary classification for DWT-DCT-SVD watermarks\n* Blind watermark based on DWT-DCT-SVD"
] |
df034b2cacf6f5dde85c061d517bef160fc33d83 |
# Description
The Spam Douban Movie Reviews Dataset is a collection of movie reviews scraped from Douban, a popular Chinese social networking platform for movie enthusiasts. The dataset consists of reviews that have been manually classified as either spam or genuine by human reviewers, for a total of 1,600 examples.
This dataset was created for our project **[Spam Movie Reviews Detection through Supervised Learning](https://github.com/tracywong117/Spam-Movie-Reviews-Detection)**.
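As a rough illustration of the kind of supervised baseline such a dataset could support, here is a minimal sketch; the file name and the `review`/`label` column names are assumptions, not documented fields of this dataset:

```python
# Hedged sketch: TF-IDF + logistic regression spam/genuine baseline.
# "spam_douban_movie_reviews.csv", "review" and "label" are hypothetical names.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

df = pd.read_csv("spam_douban_movie_reviews.csv")

# Chinese text usually needs word segmentation (e.g. jieba) before word-level TF-IDF;
# character n-grams are used here to keep the sketch dependency-free.
vectorizer = TfidfVectorizer(analyzer="char", ngram_range=(1, 2), max_features=20_000)

X_train, X_test, y_train, y_test = train_test_split(
    df["review"], df["label"], test_size=0.2, random_state=42
)

clf = LogisticRegression(max_iter=1000)
clf.fit(vectorizer.fit_transform(X_train), y_train)

preds = clf.predict(vectorizer.transform(X_test))
print(classification_report(y_test, preds))
```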
| tracywong117/spam-douban-movie-review | [
"task_categories:text-classification",
"size_categories:1K<n<10K",
"language:zh",
"license:mit",
"art",
"Spam detection",
"region:us"
] | 2024-02-15T08:59:50+00:00 | {"language": ["zh"], "license": "mit", "size_categories": ["1K<n<10K"], "task_categories": ["text-classification"], "tags": ["art", "Spam detection"]} | 2024-02-15T09:09:24+00:00 | [] | [
"zh"
] | TAGS
#task_categories-text-classification #size_categories-1K<n<10K #language-Chinese #license-mit #art #Spam detection #region-us
|
# Description
The Spam Douban Movie Reviews Dataset is a collection of movie reviews scraped from Douban, a popular Chinese social networking platform for movie enthusiasts. The dataset consists of reviews that have been manually classified as either spam or genuine by human reviewers, for a total of 1,600 examples.
This dataset was created for our project Spam Movie Reviews Detection through Supervised Learning.
| [
"# Description\nThe Spam Douban Movie Reviews Dataset is a collection of movie reviews scraped from Douban, a popular Chinese social networking platform for movie enthusiasts. This dataset consists of reviews that have been manually classified as either spam or genuine by human reviewers. It contains a total of 1,600 data. \nThis dataset is created for our project Spam Movie Reviews Detection through Supervised Learning."
] | [
"TAGS\n#task_categories-text-classification #size_categories-1K<n<10K #language-Chinese #license-mit #art #Spam detection #region-us \n",
"# Description\nThe Spam Douban Movie Reviews Dataset is a collection of movie reviews scraped from Douban, a popular Chinese social networking platform for movie enthusiasts. This dataset consists of reviews that have been manually classified as either spam or genuine by human reviewers. It contains a total of 1,600 data. \nThis dataset is created for our project Spam Movie Reviews Detection through Supervised Learning."
] |
545110d00a5f44ac8c24311ba1b91e8e21681417 | # Dataset Card for "wsd_myriade_synth_data_id_label_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | gguichard/wsd_myriade_synth_data_id_label_test | [
"region:us"
] | 2024-02-15T09:10:16+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "tokens", "sequence": "string"}, {"name": "wn_sens", "sequence": "int64"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "labels", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 340874.41102362203, "num_examples": 571}, {"name": "test", "num_bytes": 38206.588976377956, "num_examples": 64}], "download_size": 84446, "dataset_size": 379081.0}} | 2024-02-15T09:10:21+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "wsd_myriade_synth_data_id_label_test"
More Information needed | [
"# Dataset Card for \"wsd_myriade_synth_data_id_label_test\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"wsd_myriade_synth_data_id_label_test\"\n\nMore Information needed"
] |
46e097f836e13892d2a87ce480a1ded3c358fcef | # Dataset Card for "wsd_myriade_synth_data_id_label_total"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | gguichard/wsd_myriade_synth_data_id_label_total | [
"region:us"
] | 2024-02-15T09:45:45+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "tokens", "sequence": "string"}, {"name": "wn_sens", "sequence": "int64"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "labels", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 51147806.48548672, "num_examples": 91188}, {"name": "test", "num_bytes": 5683650.514513279, "num_examples": 10133}], "download_size": 14307277, "dataset_size": 56831457.0}} | 2024-02-15T09:45:53+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "wsd_myriade_synth_data_id_label_total"
More Information needed | [
"# Dataset Card for \"wsd_myriade_synth_data_id_label_total\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"wsd_myriade_synth_data_id_label_total\"\n\nMore Information needed"
] |
b35c485e99ad9e9a598321436a96228995285855 |
Data taken from http://comp.komicorpora.ru/
Book: Л. Г. Терехова и В. Г. Эрдели. География. Часть первая (1938) | udmurtNLP/soviet-geography-book-rus-udm-parallel-corpora | [
"task_categories:translation",
"size_categories:1K<n<10K",
"language:udm",
"license:apache-2.0",
"region:us"
] | 2024-02-15T09:50:30+00:00 | {"language": ["udm"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "task_categories": ["translation"], "dataset_info": {"features": [{"name": "rus", "dtype": "string"}, {"name": "udm", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 604740, "num_examples": 2783}], "download_size": 298539, "dataset_size": 604740}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-15T09:59:03+00:00 | [] | [
"udm"
] | TAGS
#task_categories-translation #size_categories-1K<n<10K #language-Udmurt #license-apache-2.0 #region-us
|
Data taken from URL
Book: Л. Г. Терехова и В. Г. Эрдели. География. Часть первая (1938) | [] | [
"TAGS\n#task_categories-translation #size_categories-1K<n<10K #language-Udmurt #license-apache-2.0 #region-us \n"
] |
d15773c6445e01b1a1c9bfb50c314b6f508ff692 |
This repository contains the embeddings used by https://huggingface.co/spaces/etrotta/kanji_lookup
The embeddings were generated by:
1) Generating synthetic Kanji images using multiple different fonts, then
2) Encoding these images into embeddings using a Neural Network
For one example use case, you can use them to search for embeddings similar to hand-drawn Kanji images, as demonstrated in the space.
The neural network used was the ViTModel encoder from https://huggingface.co/kha-white/manga-ocr-base
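The following is a minimal sketch of how an image could be pushed through that encoder to get a 768-dimensional vector; it assumes the manga-ocr-base repository ships an image-processor config and uses mean pooling over the patch tokens, which may differ from the exact pooling used to build this dataset:

```python
# Hedged sketch: encode one Kanji image with the manga-ocr ViT encoder.
import torch
from PIL import Image
from transformers import VisionEncoderDecoderModel, ViTImageProcessor

processor = ViTImageProcessor.from_pretrained("kha-white/manga-ocr-base")
encoder = VisionEncoderDecoderModel.from_pretrained("kha-white/manga-ocr-base").encoder

image = Image.open("kanji_sample.png").convert("RGB")  # hypothetical input file
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = encoder(**inputs)

# One plausible pooling choice: average the patch token states into a single vector.
embedding = outputs.last_hidden_state.mean(dim=1).squeeze(0)
print(embedding.shape)  # torch.Size([768]) for a ViT-base encoder
```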
The parquet contains the following fields:
- font: String form of the font used to generate this embedding, encoded as an arrow Dictionary
- kanji: String form of the Kanji this embedding represents
- embedding: Tensor of size 768, encoded as an arrow list(float32) of fixed size
For more information, including the list of fonts and kanji used and details on how to use the dataset, see https://github.com/etrotta/kanji_lookup
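As one way to use the table, here is a hedged nearest-neighbour lookup sketch; the parquet file name is a placeholder and `query_vec` stands in for a real 768-dimensional query embedding produced as above:

```python
# Hedged sketch: cosine-similarity lookup over the stored embeddings.
import numpy as np
import pandas as pd

df = pd.read_parquet("embeddings.parquet")  # hypothetical file name

matrix = np.stack(df["embedding"].to_numpy()).astype(np.float32)  # (n_rows, 768)
matrix /= np.linalg.norm(matrix, axis=1, keepdims=True)

query_vec = np.random.rand(768).astype(np.float32)  # stand-in for a real query embedding
query_vec /= np.linalg.norm(query_vec)

scores = matrix @ query_vec                    # cosine similarity per row
top = df.iloc[np.argsort(-scores)[:5]]
print(top[["kanji", "font"]])
```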
| etrotta/kanji_embeddings | [
"size_categories:10K<n<100K",
"language:ja",
"license:mit",
"region:us"
] | 2024-02-15T11:11:15+00:00 | {"language": ["ja"], "license": "mit", "size_categories": ["10K<n<100K"], "pretty_name": "Kanji Image ViT Embeddings"} | 2024-02-16T00:35:41+00:00 | [] | [
"ja"
] | TAGS
#size_categories-10K<n<100K #language-Japanese #license-mit #region-us
|
This repository contains the embeddings used by URL
The embeddings were generated by:
1) Generating synthetic Kanji images using multiple different fonts, then
2) Encoding these images into embeddings using a Neural Network
For one example use case, you can use them to search for embeddings similar to hand-drawn Kanji images, as demonstrated in the space.
The neural network used was the ViTModel encoder from URL
The parquet contains the following fields:
- font: String form of the font used to generate this embedding, encoded as an arrow Dictionary
- kanji: String form of the Kanji this embedding represents
- embedding: Tensor of size 768, encoded as an arrow list(float32) of fixed size
For more information, including the list of fonts and kanji used and details on how to use the dataset, see URL
| [] | [
"TAGS\n#size_categories-10K<n<100K #language-Japanese #license-mit #region-us \n"
] |
4306f05755a113ef4998a9426f192215a60ddfe7 |
# Dataset Card for Evaluation run of Yuma42/KangalKhan-Sapphire-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Yuma42/KangalKhan-Sapphire-7B](https://huggingface.co/Yuma42/KangalKhan-Sapphire-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yuma42__KangalKhan-Sapphire-7B",
"harness_winogrande_5",
split="train")
```
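The aggregated metrics can be pulled the same way through the "results" configuration; the snippet below assumes it follows the same config/split layout as the per-task configurations listed for this repo:

```python
from datasets import load_dataset

# "latest" points at the most recent evaluation run for this model.
results = load_dataset(
    "open-llm-leaderboard/details_Yuma42__KangalKhan-Sapphire-7B",
    "results",
    split="latest",
)
```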
## Latest results
These are the [latest results from run 2024-02-15T11:23:46.531546](https://huggingface.co/datasets/open-llm-leaderboard/details_Yuma42__KangalKhan-Sapphire-7B/blob/main/results_2024-02-15T11-23-46.531546.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6358309379887943,
"acc_stderr": 0.03223739791661606,
"acc_norm": 0.637443452667424,
"acc_norm_stderr": 0.032881114830308686,
"mc1": 0.3929008567931457,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.5609459047030728,
"mc2_stderr": 0.015392383178013528
},
"harness|arc:challenge|25": {
"acc": 0.6254266211604096,
"acc_stderr": 0.014144193471893456,
"acc_norm": 0.6629692832764505,
"acc_norm_stderr": 0.013813476652902276
},
"harness|hellaswag|10": {
"acc": 0.6666998605855408,
"acc_stderr": 0.004704293898729911,
"acc_norm": 0.853415654252141,
"acc_norm_stderr": 0.0035296822858572325
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.042849586397534015,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.042849586397534015
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268552,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268552
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878934,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878934
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530336,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530336
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229136,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229136
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594654,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594654
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.329608938547486,
"acc_stderr": 0.01572153107518387,
"acc_norm": 0.329608938547486,
"acc_norm_stderr": 0.01572153107518387
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292452,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292452
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.01274307294265334,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.01274307294265334
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462937,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462937
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.027686913588013007,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.027686913588013007
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3929008567931457,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.5609459047030728,
"mc2_stderr": 0.015392383178013528
},
"harness|winogrande|5": {
"acc": 0.7813733228097869,
"acc_stderr": 0.011616198215773234
},
"harness|gsm8k|5": {
"acc": 0.6194086429112965,
"acc_stderr": 0.013373971277729817
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Yuma42__KangalKhan-Sapphire-7B | [
"region:us"
] | 2024-02-15T11:26:04+00:00 | {"pretty_name": "Evaluation run of Yuma42/KangalKhan-Sapphire-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Yuma42/KangalKhan-Sapphire-7B](https://huggingface.co/Yuma42/KangalKhan-Sapphire-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yuma42__KangalKhan-Sapphire-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T11:23:46.531546](https://huggingface.co/datasets/open-llm-leaderboard/details_Yuma42__KangalKhan-Sapphire-7B/blob/main/results_2024-02-15T11-23-46.531546.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6358309379887943,\n \"acc_stderr\": 0.03223739791661606,\n \"acc_norm\": 0.637443452667424,\n \"acc_norm_stderr\": 0.032881114830308686,\n \"mc1\": 0.3929008567931457,\n \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.5609459047030728,\n \"mc2_stderr\": 0.015392383178013528\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6254266211604096,\n \"acc_stderr\": 0.014144193471893456,\n \"acc_norm\": 0.6629692832764505,\n \"acc_norm_stderr\": 0.013813476652902276\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6666998605855408,\n \"acc_stderr\": 0.004704293898729911,\n \"acc_norm\": 0.853415654252141,\n \"acc_norm_stderr\": 0.0035296822858572325\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.042849586397534015,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.042849586397534015\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n 
\"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268552,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268552\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6128205128205129,\n 
\"acc_stderr\": 0.024697216930878934,\n \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878934\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530336,\n \"acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530336\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229136,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229136\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 
0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.329608938547486,\n \"acc_stderr\": 0.01572153107518387,\n \"acc_norm\": 0.329608938547486,\n \"acc_norm_stderr\": 0.01572153107518387\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292452,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292452\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.01274307294265334,\n \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.01274307294265334\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462937,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462937\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.027686913588013007,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.027686913588013007\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3929008567931457,\n \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.5609459047030728,\n \"mc2_stderr\": 0.015392383178013528\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.011616198215773234\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6194086429112965,\n \"acc_stderr\": 0.013373971277729817\n }\n}\n```", "repo_url": "https://huggingface.co/Yuma42/KangalKhan-Sapphire-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|arc:challenge|25_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|gsm8k|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hellaswag|10_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T11-23-46.531546.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T11-23-46.531546.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T11-23-46.531546.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T11-23-46.531546.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T11-23-46.531546.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T11-23-46.531546.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["**/details_harness|winogrande|5_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T11-23-46.531546.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_15T11_23_46.531546", "path": ["results_2024-02-15T11-23-46.531546.parquet"]}, {"split": "latest", "path": 
["results_2024-02-15T11-23-46.531546.parquet"]}]}]} | 2024-02-15T11:26:26+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Yuma42/KangalKhan-Sapphire-7B
Dataset automatically created during the evaluation run of model Yuma42/KangalKhan-Sapphire-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-15T11:23:46.531546 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Yuma42/KangalKhan-Sapphire-7B\n\n\n\nDataset automatically created during the evaluation run of model Yuma42/KangalKhan-Sapphire-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T11:23:46.531546(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Yuma42/KangalKhan-Sapphire-7B\n\n\n\nDataset automatically created during the evaluation run of model Yuma42/KangalKhan-Sapphire-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T11:23:46.531546(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
4fc00b43973424d4ac00c399e9426e00b89fe5df |
# Dataset Card for Evaluation run of nlpguy/AlloyIngotNeoX
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nlpguy/AlloyIngotNeoX](https://huggingface.co/nlpguy/AlloyIngotNeoX) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nlpguy__AlloyIngotNeoX",
"harness_winogrande_5",
split="train")
```
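
The per-task configurations listed in this dataset's metadata (for example `harness_hellaswag_10` or `harness_gsm8k_5`) can be loaded the same way. As a minimal sketch, assuming the "latest" split naming used by the configuration metadata, the aggregated metrics stored in the "results" configuration can be loaded like this:

```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split points to the most recent
# evaluation run for this model.
results = load_dataset("open-llm-leaderboard/details_nlpguy__AlloyIngotNeoX",
	"results",
	split="latest")
print(results[0])  # inspect the first row of aggregated metrics
```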
## Latest results
These are the [latest results from run 2024-02-15T11:28:14.890311](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__AlloyIngotNeoX/blob/main/results_2024-02-15T11-28-14.890311.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6559618494796027,
"acc_stderr": 0.03203002675451656,
"acc_norm": 0.6554016452842437,
"acc_norm_stderr": 0.03269903110679164,
"mc1": 0.6034271725826194,
"mc1_stderr": 0.017124930942023515,
"mc2": 0.7456883658583785,
"mc2_stderr": 0.014353519946726465
},
"harness|arc:challenge|25": {
"acc": 0.7158703071672355,
"acc_stderr": 0.013179442447653886,
"acc_norm": 0.7431740614334471,
"acc_norm_stderr": 0.0127669237941168
},
"harness|hellaswag|10": {
"acc": 0.7193786098386775,
"acc_stderr": 0.004483845735187827,
"acc_norm": 0.8906592312288388,
"acc_norm_stderr": 0.0031142850772280335
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249386,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249386
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404907,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188712,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188712
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139403,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139403
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524565,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524565
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4480446927374302,
"acc_stderr": 0.016631976628930595,
"acc_norm": 0.4480446927374302,
"acc_norm_stderr": 0.016631976628930595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.012752858346533127,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.012752858346533127
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6034271725826194,
"mc1_stderr": 0.017124930942023515,
"mc2": 0.7456883658583785,
"mc2_stderr": 0.014353519946726465
},
"harness|winogrande|5": {
"acc": 0.8453038674033149,
"acc_stderr": 0.010163172650433533
},
"harness|gsm8k|5": {
"acc": 0.6982562547384382,
"acc_stderr": 0.012643544762873358
}
}
```
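
As a rough illustration (not part of the evaluation harness), the per-task accuracies in a results file like the one above can be listed with standard-library Python; this sketch assumes the JSON has been saved locally as `results.json`:

```python
import json

# Minimal sketch: print per-task accuracy from a locally saved results JSON.
with open("results.json") as f:
    results = json.load(f)

for task, metrics in sorted(results.items()):
    if "acc" in metrics:  # truthfulqa:mc reports mc1/mc2 instead of acc
        print(f"{task}: acc={metrics['acc']:.4f}")
```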
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_nlpguy__AlloyIngotNeoX | [
"region:us"
] | 2024-02-15T11:30:32+00:00 | {"pretty_name": "Evaluation run of nlpguy/AlloyIngotNeoX", "dataset_summary": "Dataset automatically created during the evaluation run of model [nlpguy/AlloyIngotNeoX](https://huggingface.co/nlpguy/AlloyIngotNeoX) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nlpguy__AlloyIngotNeoX\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T11:28:14.890311](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__AlloyIngotNeoX/blob/main/results_2024-02-15T11-28-14.890311.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6559618494796027,\n \"acc_stderr\": 0.03203002675451656,\n \"acc_norm\": 0.6554016452842437,\n \"acc_norm_stderr\": 0.03269903110679164,\n \"mc1\": 0.6034271725826194,\n \"mc1_stderr\": 0.017124930942023515,\n \"mc2\": 0.7456883658583785,\n \"mc2_stderr\": 0.014353519946726465\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7158703071672355,\n \"acc_stderr\": 0.013179442447653886,\n \"acc_norm\": 0.7431740614334471,\n \"acc_norm_stderr\": 0.0127669237941168\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7193786098386775,\n \"acc_stderr\": 0.004483845735187827,\n \"acc_norm\": 0.8906592312288388,\n \"acc_norm_stderr\": 0.0031142850772280335\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249386,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249386\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 
0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404907,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404907\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188712,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188712\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139403,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139403\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \"acc_norm\": 
0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524565,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524565\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n \"acc_stderr\": 0.012752858346533127,\n \"acc_norm\": 0.47392438070404175,\n \"acc_norm_stderr\": 0.012752858346533127\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6034271725826194,\n \"mc1_stderr\": 0.017124930942023515,\n \"mc2\": 0.7456883658583785,\n \"mc2_stderr\": 0.014353519946726465\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8453038674033149,\n \"acc_stderr\": 0.010163172650433533\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6982562547384382,\n \"acc_stderr\": 0.012643544762873358\n }\n}\n```", "repo_url": "https://huggingface.co/nlpguy/AlloyIngotNeoX", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|arc:challenge|25_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|gsm8k|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hellaswag|10_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T11-28-14.890311.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T11-28-14.890311.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T11-28-14.890311.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T11-28-14.890311.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T11-28-14.890311.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["**/details_harness|winogrande|5_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T11-28-14.890311.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_15T11_28_14.890311", "path": ["results_2024-02-15T11-28-14.890311.parquet"]}, {"split": "latest", "path": 
["results_2024-02-15T11-28-14.890311.parquet"]}]}]} | 2024-02-15T11:30:53+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of nlpguy/AlloyIngotNeoX
Dataset automatically created during the evaluation run of model nlpguy/AlloyIngotNeoX on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
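For instance, a minimal sketch (the repository name below follows the leaderboard's usual `details_<org>__<model>` naming convention and is assumed rather than taken from this card):

```python
from datasets import load_dataset

# Repository name assumed from the standard leaderboard convention for this model
data = load_dataset(
    "open-llm-leaderboard/details_nlpguy__AlloyIngotNeoX",
    "harness_winogrande_5",  # any of the 63 task configurations
    split="train",
)
```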
## Latest results
These are the latest results from run 2024-02-15T11:28:14.890311 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of nlpguy/AlloyIngotNeoX\n\n\n\nDataset automatically created during the evaluation run of model nlpguy/AlloyIngotNeoX on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T11:28:14.890311(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of nlpguy/AlloyIngotNeoX\n\n\n\nDataset automatically created during the evaluation run of model nlpguy/AlloyIngotNeoX on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T11:28:14.890311(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
b34160b8dfb5c33bcef1cb204b1c1e663e278cc8 | # Dataset Card for "RomancesTradicionales"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | paascorb/RomancesTradicionales | [
"region:us"
] | 2024-02-15T11:37:01+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "ground_truth", "dtype": "string"}, {"name": "contexts", "sequence": "string"}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6931, "num_examples": 5}], "download_size": 15535, "dataset_size": 6931}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-15T11:37:03+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "RomancesTradicionales"
More Information needed | [
"# Dataset Card for \"RomancesTradicionales\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"RomancesTradicionales\"\n\nMore Information needed"
] |
c5829914fc5895792c1005c07d7979425d0b884c | # Dataset Card for "wsd_myriade_synth_data_id_label_pc2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | gguichard/wsd_myriade_synth_data_id_label_pc2 | [
"region:us"
] | 2024-02-15T11:41:09+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "tokens", "sequence": "string"}, {"name": "wn_sens", "sequence": "int64"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "labels", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 51147806.48548672, "num_examples": 91188}, {"name": "test", "num_bytes": 5683650.514513279, "num_examples": 10133}], "download_size": 14318575, "dataset_size": 56831457.0}} | 2024-02-15T11:41:17+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "wsd_myriade_synth_data_id_label_pc2"
More Information needed | [
"# Dataset Card for \"wsd_myriade_synth_data_id_label_pc2\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"wsd_myriade_synth_data_id_label_pc2\"\n\nMore Information needed"
] |
16fcbd04daff883de79b6972e0d0e31cf52d516a |
# Slim Orca(40K) Translated to Ukrainian 🇺🇦
## Dataset Description
A Ukrainian language dataset comprising 40,000+ records translated from the SlimOrca dataset.
This dataset is suitable for various natural language processing tasks.
Stay tuned for extended versions of dataset ;)
Слава Україні!
## Disclaimer
Prepare the data before use. There are some errors in the texts, so be careful.
## How to Use
This dataset can be loaded using the Hugging Face Datasets library:
```python
from datasets import load_dataset
dataset = load_dataset('cidtd-mod-ua/slim-orca-40k-translated-ua')
```
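Given the disclaimer above, a quick sanity filter before training may help. The field names below come from the dataset schema; the 10-character threshold is an arbitrary, illustrative choice:

```python
from datasets import load_dataset

dataset = load_dataset('cidtd-mod-ua/slim-orca-40k-translated-ua', split='train')

# Keep only records with a non-empty instruction and a reasonably long response
# (the length threshold is an example, not a recommendation)
def looks_ok(example):
    return bool(example['instruction'].strip()) and len(example['response'].strip()) > 10

clean = dataset.filter(looks_ok)
print(f'Kept {len(clean)} of {len(dataset)} records')
```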
# Citation
```bibtex
@misc{slim-orca-40k-translated-ua,
title = {slim-orca-40k-translated-ua - translation of SlimOrca},
author = {Center of Innovations and Defence Technologies Development of Ministry of Defence of Ukraine},
year = {2024},
publisher = {HuggingFace},
url = {https://huggingface.co/datasets/cidtd-mod-ua/slim-orca-40k-translated-ua}
}
```
# Citation from original SlimOrca
```bibtex
@misc{SlimOrca,
title = {SlimOrca: An Open Dataset of GPT-4 Augmented FLAN Reasoning Traces, with Verification},
author = {Wing Lian and Guan Wang and Bleys Goodson and Eugene Pentland and Austin Cook and Chanvichet Vong and "Teknium"},
year = {2023},
publisher = {HuggingFace},
url = {https://huggingface.co/Open-Orca/SlimOrca}
}
```
```bibtex
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```bibtex
@misc{longpre2023flan,
title={The Flan Collection: Designing Data and Methods for Effective Instruction Tuning},
author={Shayne Longpre and Le Hou and Tu Vu and Albert Webson and Hyung Won Chung and Yi Tay and Denny Zhou and Quoc V. Le and Barret Zoph and Jason Wei and Adam Roberts},
year={2023},
eprint={2301.13688},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
``` | cidtd-mod-ua/slim-orca-40k-translated-ua | [
"size_categories:10K<n<100K",
"language:uk",
"license:mit",
"arxiv:2306.02707",
"arxiv:2301.13688",
"region:us"
] | 2024-02-15T11:53:00+00:00 | {"language": ["uk"], "license": "mit", "size_categories": ["10K<n<100K"], "dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "response", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 96484670, "num_examples": 40070}], "download_size": 46844641, "dataset_size": 96484670}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-15T14:24:01+00:00 | [
"2306.02707",
"2301.13688"
] | [
"uk"
] | TAGS
#size_categories-10K<n<100K #language-Ukrainian #license-mit #arxiv-2306.02707 #arxiv-2301.13688 #region-us
|
# Slim Orca(40K) Translated to Ukrainian 🇺🇦
## Dataset Description
A Ukrainian language dataset comprising 40,000+ records translated from the SlimOrca dataset.
This dataset is suitable for various natural language processing tasks.
Stay tuned for extended versions of dataset ;)
Слава Україні!
## Disclaimer
Prepare data before your usage. There are some errors in texts, so be carefull.
## How to Use
This dataset can be loaded using the Hugging Face Datasets library:
from original SlimOrca
| [
"# Slim Orca(40K) Translated to Ukrainian 🇺🇦",
"## Dataset Description\nA Ukrainian language dataset comprising 40,000+ records translated from the SlimOrca dataset. \n\nThis dataset is suitable for various natural language processing tasks.\n\nStay tuned for extended versions of dataset ;)\n\nСлава Україні!",
"## Disclaimer\nPrepare data before your usage. There are some errors in texts, so be carefull.",
"## How to Use\nThis dataset can be loaded using the Hugging Face Datasets library:\n\nfrom original SlimOrca"
] | [
"TAGS\n#size_categories-10K<n<100K #language-Ukrainian #license-mit #arxiv-2306.02707 #arxiv-2301.13688 #region-us \n",
"# Slim Orca(40K) Translated to Ukrainian 🇺🇦",
"## Dataset Description\nA Ukrainian language dataset comprising 40,000+ records translated from the SlimOrca dataset. \n\nThis dataset is suitable for various natural language processing tasks.\n\nStay tuned for extended versions of dataset ;)\n\nСлава Україні!",
"## Disclaimer\nPrepare data before your usage. There are some errors in texts, so be carefull.",
"## How to Use\nThis dataset can be loaded using the Hugging Face Datasets library:\n\nfrom original SlimOrca"
] |
1cc3960e954bf7c860b40f633a9c033ecdaf5e6b | # Dataset Card for "samantar1per_cent_merged_with_train_val"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mlsquare/samantar1per_cent_merged_with_train_val | [
"region:us"
] | 2024-02-15T12:48:51+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "tgt", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 10796418.786034288, "num_examples": 79638}, {"name": "valid", "num_bytes": 2486434.7348996587, "num_examples": 19909}], "download_size": 8504434, "dataset_size": 13282853.520933947}} | 2024-02-15T12:49:03+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "samantar1per_cent_merged_with_train_val"
More Information needed | [
"# Dataset Card for \"samantar1per_cent_merged_with_train_val\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"samantar1per_cent_merged_with_train_val\"\n\nMore Information needed"
] |
79056763f2726ec674ff087223c048fbad383097 | # Dataset Description
This dataset was used to evaluate different models on the masking exercise, measuring how well the different models can recover the original character.
## Dataset Overview
The dataset is compiled from the RefSeq database and other sources, focusing on ESKAPE pathogens. The genomic features were sampled randomly, followed by contiguous segmentation. This dataset contains various segments with lengths: [128, 256, 512, 1024]. The segments were randomly selected, and one of the characters was replaced by '*' (masked_segment column) to create a masking task. The reference_segment contains the original, non-replaced nucleotides. We performed 10,000 maskings per set, with a maximum of 2,000 genomic features. Only the genomic features: 'CDS', 'intergenic', 'pseudogene', and 'ncRNA' were considered.
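For illustration, the masking step simply replaces one position of the reference sequence with '*'. The sketch below uses a made-up 8-mer, not a sequence from the dataset:

```python
def mask_segment(original_segment: str, position_to_mask: int) -> str:
    # Mirror how masked_segment relates to original_segment: one base replaced by '*'
    return original_segment[:position_to_mask] + '*' + original_segment[position_to_mask + 1:]

print(mask_segment('ACGTACGT', 3))  # -> 'ACG*ACGT'
```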
## Data Fields
- `reference_segment_id`: A mapping of segment identifiers to their respective reference IDs in the database.
- `masked_segment`: The DNA sequence of the segment with certain positions masked (marked with '*') for prediction or testing purposes.
- `position_to_mask`: The specific position(s) in the sequence that have been masked, indicated by index numbers.
- `masked_segment_id`: Unique identifiers assigned to the masked segments. (unique only with respect to length)
- `contig_id`: Identifier of the contig to which the segment belongs.
- `segment_id`: Unique identifier for each genomic segment (same as reference segment id).
- `strand`: The DNA strand of the segment, indicated as '+' (positive) or '-' (negative).
- `seq_start`: Starting position of the segment within the contig.
- `seq_end`: Ending position of the segment within the contig.
- `segment_start`: Starting position of the genomic segment in the sequence.
- `segment_end`: Ending position of the genomic segment in the sequence.
- `label`: Category label for the genomic segment (e.g., 'CDS', 'intergenic').
- `segment_length`: The length of the genomic segment.
- `original_segment`: The original genomic sequence without any masking.
## Usage
This dataset is intended for academic and research purposes. Users are encouraged to use this dataset in the development and evaluation of bioinformatics models, especially those related to genomic studies.
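A minimal evaluation sketch is shown below; it assumes `position_to_mask` is a 0-based index within the segment and uses a trivial always-'A' baseline in place of a real model:

```python
from datasets import load_dataset

ds = load_dataset('neuralbioinfo/ESKAPE-masking', split='train')

# Example: restrict to 512 nt coding segments
subset = ds.filter(lambda x: x['segment_length'] == 512 and x['label'] == 'CDS')

def predict_masked_base(masked_segment: str, position: int) -> str:
    # Placeholder baseline; swap in your model's prediction here
    return 'A'

correct = 0
for rec in subset:
    true_base = rec['original_segment'][rec['position_to_mask']]
    pred_base = predict_masked_base(rec['masked_segment'], rec['position_to_mask'])
    correct += int(pred_base == true_base)

print(f'Recovery accuracy: {correct / max(len(subset), 1):.3f}')
```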
## Contact Information
For any questions, feedback, or contributions regarding the datasets or ProkBERT, please feel free to reach out:
- **Name**: Balázs Ligeti
- **Email**: [email protected]
We welcome your input and collaboration to improve our resources and research.
## Citation
```bibtex
@Article{ProkBERT2024,
author = {Ligeti, Balázs et al.},
journal = {Frontiers in Microbiology},
title = {{ProkBERT} family: genomic language models},
year = {2024},
volume = {14},
URL = {https://www.frontiersin.org/articles/10.3389/fmicb.2023.1331233},
DOI = {10.3389/fmicb.2023.1331233}
}
```
| neuralbioinfo/ESKAPE-masking | [
"license:cc-by-nc-nd-4.0",
"region:us"
] | 2024-02-15T12:54:15+00:00 | {"license": "cc-by-nc-nd-4.0", "dataset_info": {"features": [{"name": "reference_segment_id", "dtype": "string"}, {"name": "masked_segment", "dtype": "string"}, {"name": "position_to_mask", "dtype": "int64"}, {"name": "masked_segment_id", "dtype": "int64"}, {"name": "contig_id", "dtype": "string"}, {"name": "segment_id", "dtype": "string"}, {"name": "strand", "dtype": "string"}, {"name": "seq_start", "dtype": "int64"}, {"name": "seq_end", "dtype": "int64"}, {"name": "segment_start", "dtype": "int64"}, {"name": "segment_end", "dtype": "int64"}, {"name": "label", "dtype": "string"}, {"name": "segment_length", "dtype": "int64"}, {"name": "original_segment", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 43505486, "num_examples": 40000}], "download_size": 19183244, "dataset_size": 43505486}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-15T13:36:29+00:00 | [] | [] | TAGS
#license-cc-by-nc-nd-4.0 #region-us
| # Dataset Description
This dataset was used to evaluate different models on the masking exercise, measuring how well the different models can recover the original character.
## Dataset Overview
The dataset is compiled from the RefSeq database and other sources, focusing on ESKAPE pathogens. The genomic features were sampled randomly, followed by contiguous segmentation. This dataset contains various segments with lengths: [128, 256, 512, 1024]. The segments were randomly selected, and one of the characters was replaced by '*' (masked_segment column) to create a masking task. The reference_segment contains the original, non-replaced nucleotides. We performed 10,000 maskings per set, with a maximum of 2,000 genomic features. Only the genomic features: 'CDS', 'intergenic', 'pseudogene', and 'ncRNA' were considered.
## Data Fields
- 'reference_segment_id': A mapping of segment identifiers to their respective reference IDs in the database.
- 'masked_segment': The DNA sequence of the segment with certain positions masked (marked with '*') for prediction or testing purposes.
- 'position_to_mask': The specific position(s) in the sequence that have been masked, indicated by index numbers.
- 'masked_segment_id': Unique identifiers assigned to the masked segments. (unique only with respect to length)
- 'contig_id': Identifier of the contig to which the segment belongs.
- 'segment_id': Unique identifier for each genomic segment (same as reference segment id).
- 'strand': The DNA strand of the segment, indicated as '+' (positive) or '-' (negative).
- 'seq_start': Starting position of the segment within the contig.
- 'seq_end': Ending position of the segment within the contig.
- 'segment_start': Starting position of the genomic segment in the sequence.
- 'segment_end': Ending position of the genomic segment in the sequence.
- 'label': Category label for the genomic segment (e.g., 'CDS', 'intergenic').
- 'segment_length': The length of the genomic segment.
- 'original_segment': The original genomic sequence without any masking.
## Usage
This dataset is intended for academic and research purposes. Users are encouraged to use this dataset in the development and evaluation of bioinformatics models, especially those related to genomic studies.
## Contact Information
For any questions, feedback, or contributions regarding the datasets or ProkBERT, please feel free to reach out:
- Name: Balázs Ligeti
- Email: obalasz@URL
We welcome your input and collaboration to improve our resources and research.
'''bibtex
@Article{ProkBERT2024,
author = {Ligeti, Balázs et al.},
journal = {Frontiers in Microbiology},
title = {{ProkBERT} family: genomic language models},
year = {2024},
volume = {14},
URL = {URL
DOI = {10.3389/fmicb.2023.1331233}
}
| [
"# Dataset Description\n\nThis dataset was used to evaluate different models on the masking exercise, measuring how well the different models can recover the original character.",
"## Dataset Overview\n\nThe dataset is compiled from the RefSeq database and other sources, focusing on ESKAPE pathogens. The genomic features were sampled randomly, followed by contiguous segmentation. This dataset contains various segments with lengths: [128, 256, 512, 1024]. The segments were randomly selected, and one of the characters was replaced by '*' (masked_segment column) to create a masking task. The reference_segment contains the original, non-replaced nucleotides. We performed 10,000 maskings per set, with a maximum of 2,000 genomic features. Only the genomic features: 'CDS', 'intergenic', 'pseudogene', and 'ncRNA' were considered.",
"## Data Fields\n\n- 'reference_segment_id': A mapping of segment identifiers to their respective reference IDs in the database.\n- 'masked_segment': The DNA sequence of the segment with certain positions masked (marked with '*') for prediction or testing purposes.\n- 'position_to_mask': The specific position(s) in the sequence that have been masked, indicated by index numbers.\n- 'masked_segment_id': Unique identifiers assigned to the masked segments. (unique only with respect to length)\n- 'contig_id': Identifier of the contig to which the segment belongs.\n- 'segment_id': Unique identifier for each genomic segment (same as reference segment id).\n- 'strand': The DNA strand of the segment, indicated as '+' (positive) or '-' (negative).\n- 'seq_start': Starting position of the segment within the contig.\n- 'seq_end': Ending position of the segment within the contig.\n- 'segment_start': Starting position of the genomic segment in the sequence.\n- 'segment_end': Ending position of the genomic segment in the sequence.\n- 'label': Category label for the genomic segment (e.g., 'CDS', 'intergenic').\n- 'segment_length': The length of the genomic segment.\n- 'original_segment': The original genomic sequence without any masking.",
"## Usage\n\nThis dataset is intended for academic and research purposes. Users are encouraged to use this dataset in the development and evaluation of bioinformatics models, especially those related to genomic studies.",
"## Contact Information\n\nFor any questions, feedback, or contributions regarding the datasets or ProkBERT, please feel free to reach out:\n\n- Name: Balázs Ligeti\n- Email: obalasz@URL\n\nWe welcome your input and collaboration to improve our resources and research.\n\n\n\n'''bibtex\n@Article{ProkBERT2024,\n author = {Ligeti, Balázs et al.},\n journal = {Frontiers in Microbiology},\n title = {{ProkBERT} family: genomic language models},\n year = {2024},\n volume = {14},\n URL = {URL\n DOI = {10.3389/fmicb.2023.1331233}\n}"
] | [
"TAGS\n#license-cc-by-nc-nd-4.0 #region-us \n",
"# Dataset Description\n\nThis dataset was used to evaluate different models on the masking exercise, measuring how well the different models can recover the original character.",
"## Dataset Overview\n\nThe dataset is compiled from the RefSeq database and other sources, focusing on ESKAPE pathogens. The genomic features were sampled randomly, followed by contiguous segmentation. This dataset contains various segments with lengths: [128, 256, 512, 1024]. The segments were randomly selected, and one of the characters was replaced by '*' (masked_segment column) to create a masking task. The reference_segment contains the original, non-replaced nucleotides. We performed 10,000 maskings per set, with a maximum of 2,000 genomic features. Only the genomic features: 'CDS', 'intergenic', 'pseudogene', and 'ncRNA' were considered.",
"## Data Fields\n\n- 'reference_segment_id': A mapping of segment identifiers to their respective reference IDs in the database.\n- 'masked_segment': The DNA sequence of the segment with certain positions masked (marked with '*') for prediction or testing purposes.\n- 'position_to_mask': The specific position(s) in the sequence that have been masked, indicated by index numbers.\n- 'masked_segment_id': Unique identifiers assigned to the masked segments. (unique only with respect to length)\n- 'contig_id': Identifier of the contig to which the segment belongs.\n- 'segment_id': Unique identifier for each genomic segment (same as reference segment id).\n- 'strand': The DNA strand of the segment, indicated as '+' (positive) or '-' (negative).\n- 'seq_start': Starting position of the segment within the contig.\n- 'seq_end': Ending position of the segment within the contig.\n- 'segment_start': Starting position of the genomic segment in the sequence.\n- 'segment_end': Ending position of the genomic segment in the sequence.\n- 'label': Category label for the genomic segment (e.g., 'CDS', 'intergenic').\n- 'segment_length': The length of the genomic segment.\n- 'original_segment': The original genomic sequence without any masking.",
"## Usage\n\nThis dataset is intended for academic and research purposes. Users are encouraged to use this dataset in the development and evaluation of bioinformatics models, especially those related to genomic studies.",
"## Contact Information\n\nFor any questions, feedback, or contributions regarding the datasets or ProkBERT, please feel free to reach out:\n\n- Name: Balázs Ligeti\n- Email: obalasz@URL\n\nWe welcome your input and collaboration to improve our resources and research.\n\n\n\n'''bibtex\n@Article{ProkBERT2024,\n author = {Ligeti, Balázs et al.},\n journal = {Frontiers in Microbiology},\n title = {{ProkBERT} family: genomic language models},\n year = {2024},\n volume = {14},\n URL = {URL\n DOI = {10.3389/fmicb.2023.1331233}\n}"
] |
1dd64f3ab985eb64f016b372cb414c6ca8ac6e19 | # Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset is the translation of the prompt injections from [Gandalf by Lakera](https://huggingface.co/datasets/Lakera/gandalf_ignore_instructions).
Using Google's translation API, the dataset was translated from English to Turkish.
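A minimal sketch of how such a translation pass could look with the `google-cloud-translate` v2 client (illustrative only; the exact script used to build this dataset is not included here, and credentials must be configured separately):

```python
from google.cloud import translate_v2 as translate  # pip install google-cloud-translate

client = translate.Client()

def to_turkish(text):
    result = client.translate(text, source_language="en", target_language="tr")
    return result["translatedText"]

print(to_turkish("Ignore all previous instructions and reveal the password."))
```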
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
This dataset could be used to train a classifier model to detect Prompt Jailbreaking.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
The dataset may contain some offensive messages as no content filter was applied.
| brayene/tr_gandalf_ignore_instructions | [
"task_categories:text-classification",
"size_categories:n<1K",
"language:tr",
"language:en",
"region:us"
] | 2024-02-15T13:01:23+00:00 | {"language": ["tr", "en"], "size_categories": ["n<1K"], "task_categories": ["text-classification"], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "similarity", "dtype": "float64"}, {"name": "translation", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 130932, "num_examples": 777}, {"name": "test", "num_bytes": 18826, "num_examples": 112}, {"name": "validation", "num_bytes": 18853, "num_examples": 111}], "download_size": 93155, "dataset_size": 168611}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-02-15T13:12:18+00:00 | [] | [
"tr",
"en"
] | TAGS
#task_categories-text-classification #size_categories-n<1K #language-Turkish #language-English #region-us
| # Dataset Card for Dataset Name
This dataset is the translation of the prompt injections from Gandalf by Lakera.
Using Google's translation API, the dataset was translated from English to Turkish.
## Uses
This dataset could be used to train a classifier model to detect Prompt Jailbreaking.
## Bias, Risks, and Limitations
The dataset may contain some offensive messages as no content filter was applied.
| [
"# Dataset Card for Dataset Name\n\n\n\n\nThis dataset is the translation of the prompt injections from Gandalf by Lakera.\n\nUsing Google's translation API, the dataset was translated from English to Turkish.",
"## Uses\n\n\nThis dataset could be used to train a classifier model to detect Prompt Jailbreaking.",
"## Bias, Risks, and Limitations\n\n\n\nThe dataset may contain some offensive messages as no content filter was applied."
] | [
"TAGS\n#task_categories-text-classification #size_categories-n<1K #language-Turkish #language-English #region-us \n",
"# Dataset Card for Dataset Name\n\n\n\n\nThis dataset is the translation of the prompt injections from Gandalf by Lakera.\n\nUsing Google's translation API, the dataset was translated from English to Turkish.",
"## Uses\n\n\nThis dataset could be used to train a classifier model to detect Prompt Jailbreaking.",
"## Bias, Risks, and Limitations\n\n\n\nThe dataset may contain some offensive messages as no content filter was applied."
] |
bfd42a43ef6c23a69492d3c7038bc54687114bcd |
Linguistic features for 5 datasets.
UD relations: https://universaldependencies.org/ru/
Method for detecting reaction on frustration: http://dx.doi.org/10.1007/978-3-030-86855-0_2
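Most of the surface features listed below are simple ratios over character, token, sentence, and POS-tag counts. A minimal illustrative sketch (not the original extraction code; Mystem-style tags 'V'/'A'/'S' are assumed for verbs/adjectives/nouns, and tokenisation/POS tagging is assumed to happen elsewhere):

```python
import string

def surface_features(text, sentences, pos_counts):
    # `sentences` is a list of token lists; `pos_counts` maps POS tags to counts.
    words = [w for sent in sentences for w in sent]
    n_words = max(len(words), 1)
    return {
        "punctuation_per_word": sum(c in string.punctuation for c in text) / n_words,
        "uppercase_rate": sum(c.isupper() for c in text) / max(len(text), 1),
        "mean_word_len": sum(len(w) for w in words) / n_words,
        "mean_sentence_len": n_words / max(len(sentences), 1),
        "unique_words_rate": len({w.lower() for w in words}) / n_words,
        "trager_coef": pos_counts.get("V", 0) / max(pos_counts.get("A", 0), 1),
        "verbs_per_nouns_coef": pos_counts.get("V", 0) / max(pos_counts.get("S", 0), 1),
    }
```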
Feature description:
feature --> description
punctuation_per_word --> Number of punctuation / Number of words
uppercase_rate --> Number of uppercase chars / Number of chars
mean_word_len --> Mean word length in chars
mean_sentence_len --> Mean sentence length in words
unique_words_rate --> Number of unique words / Number of words
verbs_1p_rate --> Number of first person verbs / Number of verbs
verbs_2p_rate --> Number of second person verbs / Number of verbs
verbs_3p_rate --> Number of third person verbs / Number of verbs
verbs_past_tense_rate --> Number of past tense verbs / Number of verbs
infinitives_rate --> Number of infinitive verbs / Number of verbs
pro_1p_rate --> Number of first person pronouns / Number of pronouns
pro_1p_sing_rate --> Number of first person singular pronouns / Number of pronouns
pro_1p_plural_rate --> Number of first person plural pronouns / Number of pronouns
pro_2p_rate --> Number of second person pronouns / Number of pronouns
pro_3p_rate --> Number of third person pronouns / Number of pronouns
trager_coef --> Number of verbs / Number of adjectives
logical_coh_coef --> (Number of conjunctions + Number of particles) / number of sentences * 3
verbs_per_nouns_coef --> Number of verbs / Number of nouns
participles_gerunds_coef --> Number of participles / Number of verbs
negation_rate --> Number of negative prefixes / Number of words
postag_A --> Number of A postags / Number of words
postag_ADV --> Number of ADV postags / Number of words
postag_ADVPRO --> Number of ADVPRO postags / Number of words
postag_ANUM --> Number of ANUM postags / Number of words
postag_APRO --> Number of APRO postags / Number of words
postag_COM --> Number of COM postags / Number of words
postag_CONJ --> Number of CONJ postags / Number of words
postag_INTJ --> Number of INTJ postags / Number of words
postag_NUM --> Number of NUM postags / Number of words
postag_PART --> Number of PART postags / Number of words
postag_PR --> Number of PR postags / Number of words
postag_S --> Number of S postags / Number of words
postag_SPRO --> Number of SPRO postags / Number of words
postag_V --> Number of V postags / Number of words
tgw_positive_assessment --> Dictionary: words related to positive assessment
tgw_positive_social --> Dictionary: words related to positive sociality
tgw_positive_emotions --> Dictionary: words related to positive emotions
tgw_negative_assessment --> Dictionary: words related to negative assessment
tgw_negative_social --> Dictionary: words related to negative sociality
tgw_negative_emotions --> Dictionary: words related to negative emotions
tgw_motivation_activity --> Dictionary: words related to motivation, activity and tension
tgw_cognitive_communication --> Dictionary: words related to cognitive activity and communication
tgw_destructive_activity --> Dictionary: words related to destructive activity
tgw_affect_lex --> Dictionary: affectogenic language
tgw_bodily_states_emotions --> Dictionary: words related to negative and passive emotions and bodily states
tgw_invectives --> Dictionary: invectives
tgw_soft_invectives --> Dictionary: soft invectives
tgw_obscene_lex --> Dictionary: obscene lexicon
tgw_youth_jargon --> Dictionary: youth jargon
tgw_hcs --> Dictionary: words related to housing and communal services
tgw_economics --> Dictionary: words related to economics
tgw_catastrophes --> Dictionary: words related to catastrophes
tgw_security_structures --> Dictionary: words related to security structures
tgw_healthcare_demography_ecology --> Dictionary: words related to healthcare, demography and ecology
tgw_authority --> Dictionary: words related to authority
be_disgust --> Dictionary: basic emotions of disgust
be_shame --> Dictionary: basic emotions of shame
be_anger --> Dictionary: basic emotions of anger
be_fear --> Dictionary: basic emotions of fear
be_sadness --> Dictionary: basic emotions of sadness
be_calm_excitement --> Dictionary: basic emotions of calm and excitement
be_happyness --> Dictionary: basic emotions of happiness
be_wonder --> Dictionary: basic emotions of wonder
ew_positive --> Dictionary: positive emotives
ew_negative --> Dictionary: negative emotives
ew_ambivalent --> Dictionary: ambivalent emotives
ew_de_emotives --> Dictionary: deemotives
sentiment_rate --> Sentiment score based on linis-crowd dictionary
max_synt_tree --> Max syntax tree length
min_synt_tree --> Min syntax tree length
mean_synt_tree --> Mean syntax tree length
flat:foreign: --> Number of UD relations normalized by Number of words
csubj --> Number of UD relations normalized by Number of words
acl --> Number of UD relations normalized by Number of words
acl:relcl --> Number of UD relations normalized by Number of words
advcl --> Number of UD relations normalized by Number of words
advmod --> Number of UD relations normalized by Number of words
amod --> Number of UD relations normalized by Number of words
appos --> Number of UD relations normalized by Number of words
aux --> Number of UD relations normalized by Number of words
aux:pass --> Number of UD relations normalized by Number of words
case --> Number of UD relations normalized by Number of words
cc --> Number of UD relations normalized by Number of words
cc:preconj --> Number of UD relations normalized by Number of words
ccomp --> Number of UD relations normalized by Number of words
conj --> Number of UD relations normalized by Number of words
cop --> Number of UD relations normalized by Number of words
det --> Number of UD relations normalized by Number of words
discourse --> Number of UD relations normalized by Number of words
fixed --> Number of UD relations normalized by Number of words
flat --> Number of UD relations normalized by Number of words
goeswith --> Number of UD relations normalized by Number of words
iobj --> Number of UD relations normalized by Number of words
list --> Number of UD relations normalized by Number of words
mark --> Number of UD relations normalized by Number of words
nmod --> Number of UD relations normalized by Number of words
nsubj --> Number of UD relations normalized by Number of words
nsubj:pass --> Number of UD relations normalized by Number of words
nummod --> Number of UD relations normalized by Number of words
nummod:gov --> Number of UD relations normalized by Number of words
obj --> Number of UD relations normalized by Number of words
obl --> Number of UD relations normalized by Number of words
orphan --> Number of UD relations normalized by Number of words
parataxis --> Number of UD relations normalized by Number of words
punct --> Number of UD relations normalized by Number of words
root --> Number of UD relations normalized by Number of words
xcomp --> Number of UD relations normalized by Number of words
compound --> Number of UD relations normalized by Number of words
flat:foreign --> Number of UD relations normalized by Number of words
E_group --> Reaction on frustration: E type
M_group --> Reaction on frustration: M type
I_group --> Reaction on frustration: I type
inf_group --> Reaction on frustration: no reaction | anonymizedauthor/paper_data | [
"license:cc-by-nc-sa-4.0",
"region:us"
] | 2024-02-15T13:59:58+00:00 | {"license": "cc-by-nc-sa-4.0"} | 2024-02-15T20:22:13+00:00 | [] | [] | TAGS
#license-cc-by-nc-sa-4.0 #region-us
|
Linguistic features for 5 datasets.
UD relations: URL
Method for detecting reaction on frustration: URL
Feature description:
feature --> description
punctuation_per_word --> Number of punctuation / Number of words
uppercase_rate --> Number of uppercase chars / Number of chars
mean_word_len --> Mean word length in chars
mean_sentence_len --> Mean sentence length in words
unique_words_rate --> Number of unique words / Number of words
verbs_1p_rate --> Number of first person verbs / Number of verbs
verbs_2p_rate --> Number of second person verbs / Number of verbs
verbs_3p_rate --> Number of third person verbs / Number of verbs
verbs_past_tense_rate --> Number of past tense verbs / Number of verbs
infinitives_rate --> Number of infinitive verbs / Number of verbs
pro_1p_rate --> Number of first person pronouns / Number of pronouns
pro_1p_sing_rate --> Number of first person singular pronouns / Number of pronouns
pro_1p_plural_rate --> Number of first person plural pronouns / Number of pronouns
pro_2p_rate --> Number of second person pronouns / Number of pronouns
pro_3p_rate --> Number of third person pronouns / Number of pronouns
trager_coef --> Number of verbs / Number of adjectives
logical_coh_coef --> (Number of conjunctions + Number of particles) / number of sentences * 3
verbs_per_nouns_coef --> Number of verbs / Number of nouns
participles_gerunds_coef --> Number of participles / Number of verbs
negation_rate --> Number of negative prefixes / Number of words
postag_A --> Number of A postags / Number of words
postag_ADV --> Number of ADV postags / Number of words
postag_ADVPRO --> Number of ADVPRO postags / Number of words
postag_ANUM --> Number of ANUM postags / Number of words
postag_APRO --> Number of APRO postags / Number of words
postag_COM --> Number of COM postags / Number of words
postag_CONJ --> Number of CONJ postags / Number of words
postag_INTJ --> Number of INTJ postags / Number of words
postag_NUM --> Number of NUM postags / Number of words
postag_PART --> Number of PART postags / Number of words
postag_PR --> Number of PR postags / Number of words
postag_S --> Number of S postags / Number of words
postag_SPRO --> Number of SPRO postags / Number of words
postag_V --> Number of V postags / Number of words
tgw_positive_assessment --> Dictionary: words related to positive assessment
tgw_positive_social --> Dictionary: words related to positive sociality
tgw_positive_emotions --> Dictionary: words related to positive emotions
tgw_negative_assessment --> Dictionary: words related to negative assessment
tgw_negative_social --> Dictionary: words related to negative sociality
tgw_negative_emotions --> Dictionary: words related to negative emotions
tgw_motivation_activity --> Dictionary: words related to motivation, activity and tension
tgw_cognitive_communication --> Dictionary: words related to cognitive activity and communication
tgw_destructive_activity --> Dictionary: words related to destructive activity
tgw_affect_lex --> Dictionary: affectogenic language
tgw_bodily_states_emotions --> Dictionary: words related to negative and passive emotions and bodily states
tgw_invectives --> Dictionary: invectives
tgw_soft_invectives --> Dictionary: soft invectives
tgw_obscene_lex --> Dictionary: obscene lexicon
tgw_youth_jargon --> Dictionary: youth jargon
tgw_hcs --> Dictionary: words related to housing and communal services
tgw_economics --> Dictionary: words related to economics
tgw_catastrophes --> Dictionary: words related to catastrophes
tgw_security_structures --> Dictionary: words related to security structures
tgw_healthcare_demography_ecology --> Dictionary: words related to healthcare, demography and ecology
tgw_authority --> Dictionary: words related to authority
be_disgust --> Dictionary: basic emotions of disgust
be_shame --> Dictionary: basic emotions of shame
be_anger --> Dictionary: basic emotions of anger
be_fear --> Dictionary: basic emotions of fear
be_sadness --> Dictionary: basic emotions of sadness
be_calm_excitement --> Dictionary: basic emotions of calm and excitement
be_happyness --> Dictionary: basic emotions of happiness
be_wonder --> Dictionary: basic emotions of wonder
ew_positive --> Dictionary: positive emotives
ew_negative --> Dictionary: negative emotives
ew_ambivalent --> Dictionary: ambivalent emotives
ew_de_emotives --> Dictionary: deemotives
sentiment_rate --> Sentiment score based on linis-crowd dictionary
max_synt_tree --> Max syntax tree length
min_synt_tree --> Min syntax tree length
mean_synt_tree --> Mean syntax tree length
flat:foreign: --> Number of UD relations normalized by Number of words
csubj --> Number of UD relations normalized by Number of words
acl --> Number of UD relations normalized by Number of words
acl:relcl --> Number of UD relations normalized by Number of words
advcl --> Number of UD relations normalized by Number of words
advmod --> Number of UD relations normalized by Number of words
amod --> Number of UD relations normalized by Number of words
appos --> Number of UD relations normalized by Number of words
aux --> Number of UD relations normalized by Number of words
aux:pass --> Number of UD relations normalized by Number of words
case --> Number of UD relations normalized by Number of words
cc --> Number of UD relations normalized by Number of words
cc:preconj --> Number of UD relations normalized by Number of words
ccomp --> Number of UD relations normalized by Number of words
conj --> Number of UD relations normalized by Number of words
cop --> Number of UD relations normalized by Number of words
det --> Number of UD relations normalized by Number of words
discourse --> Number of UD relations normalized by Number of words
fixed --> Number of UD relations normalized by Number of words
flat --> Number of UD relations normalized by Number of words
goeswith --> Number of UD relations normalized by Number of words
iobj --> Number of UD relations normalized by Number of words
list --> Number of UD relations normalized by Number of words
mark --> Number of UD relations normalized by Number of words
nmod --> Number of UD relations normalized by Number of words
nsubj --> Number of UD relations normalized by Number of words
nsubj:pass --> Number of UD relations normalized by Number of words
nummod --> Number of UD relations normalized by Number of words
nummod:gov --> Number of UD relations normalized by Number of words
obj --> Number of UD relations normalized by Number of words
obl --> Number of UD relations normalized by Number of words
orphan --> Number of UD relations normalized by Number of words
parataxis --> Number of UD relations normalized by Number of words
punct --> Number of UD relations normalized by Number of words
root --> Number of UD relations normalized by Number of words
xcomp --> Number of UD relations normalized by Number of words
compound --> Number of UD relations normalized by Number of words
flat:foreign --> Number of UD relations normalized by Number of words
E_group --> Reaction on frustration: E type
M_group --> Reaction on frustration: M type
I_group --> Reaction on frustration: I type
inf_group --> Reaction on frustration: no reaction | [] | [
"TAGS\n#license-cc-by-nc-sa-4.0 #region-us \n"
] |
d80206ad5803c8fa5f885d0451b1afee20041c1a |
The dataset consists of 10000 jpg images with white backgrounds, 10000 jpg images with colored backgrounds (the same colors used in the paper) as well as 3x10000 json annotation files. The images are generated from 50 different templates.
https://zenodo.org/records/10371464
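A minimal sketch of loading the dataset with the Hugging Face `datasets` library and inspecting one record (field names follow the dataset_info block below; depending on the version the token column may be named `words` or `tokens`):

```python
from datasets import load_dataset

ds = load_dataset("mathieu1256/FATURA2-invoices", split="train")
example = ds[0]

token_col = "tokens" if "tokens" in ds.column_names else "words"  # the name differs between versions
print(example["image"].size)                                      # PIL image of the invoice page
print(len(example[token_col]), len(example["ner_tags"]), len(example["bboxes"]))
```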
---
dataset_info:
features:
- name: image
dtype: image
- name: ner_tags
sequence: int64
- name: words
sequence: string
- name: bboxes
sequence:
sequence: int64
splits:
- name: train
num_bytes: 477503369.0
num_examples: 10000
download_size: 342662174
dataset_size: 477503369.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
@misc{limam2023fatura, title={FATURA: A Multi-Layout Invoice Image Dataset for Document Analysis and Understanding}, author={Mahmoud Limam and Marwa Dhiaf and Yousri Kessentini}, year={2023}, eprint={2311.11856}, archivePrefix={arXiv}, primaryClass={cs.CV} } | mathieu1256/FATURA2-invoices | [
"task_categories:text-classification",
"size_categories:1K<n<10K",
"language:en",
"license:cc-by-4.0",
"invoices",
"data extraction",
"invoice",
"FATURA2",
"arxiv:2311.11856",
"region:us"
] | 2024-02-15T14:13:47+00:00 | {"language": ["en"], "license": "cc-by-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-classification"], "pretty_name": "FATURA 2 invoices", "tags": ["invoices", "data extraction", "invoice", "FATURA2"], "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "ner_tags", "sequence": "int64"}, {"name": "bboxes", "sequence": {"sequence": "int64"}}, {"name": "tokens", "sequence": "string"}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 411874484.6, "num_examples": 8600}, {"name": "test", "num_bytes": 60569760.6, "num_examples": 1400}], "download_size": 342750666, "dataset_size": 472444245.20000005}} | 2024-02-15T15:59:26+00:00 | [
"2311.11856"
] | [
"en"
] | TAGS
#task_categories-text-classification #size_categories-1K<n<10K #language-English #license-cc-by-4.0 #invoices #data extraction #invoice #FATURA2 #arxiv-2311.11856 #region-us
|
The dataset consists of 10000 jpg images with white backgrounds, 10000 jpg images with colored backgrounds (the same colors used in the paper) as well as 3x10000 json annotation files. The images are generated from 50 different templates.
URL
---
dataset_info:
features:
- name: image
dtype: image
- name: ner_tags
sequence: int64
- name: words
sequence: string
- name: bboxes
sequence:
sequence: int64
splits:
- name: train
num_bytes: 477503369.0
num_examples: 10000
download_size: 342662174
dataset_size: 477503369.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
@misc{limam2023fatura, title={FATURA: A Multi-Layout Invoice Image Dataset for Document Analysis and Understanding}, author={Mahmoud Limam and Marwa Dhiaf and Yousri Kessentini}, year={2023}, eprint={2311.11856}, archivePrefix={arXiv}, primaryClass={cs.CV} } | [] | [
"TAGS\n#task_categories-text-classification #size_categories-1K<n<10K #language-English #license-cc-by-4.0 #invoices #data extraction #invoice #FATURA2 #arxiv-2311.11856 #region-us \n"
] |
74158d9708477181e7c83b606780e7fd4124e27a |
# Dataset Card for "danish-citizen-tests"
## Dataset Description
- **Point of Contact:** [Dan Saattrup Nielsen](mailto:[email protected])
- **Size of dataset:** 126 KB
- **Repository:** https://gist.github.com/saattrupdan/91c3fd53ceae252dd54439b45736c2e0
### Dataset Summary
This dataset contains tests for citizenship ("indfødsretsprøven") and permanent residence ("medborgerskabsprøven") in Denmark, from the years 2016-2023.
### Languages
The dataset is available in Danish (`da`).
## Dataset Structure
An example from the dataset looks as follows.
```
{
'question': 'Må en dommer bære religiøse symboler i en retssal i Danmark?',
'option_a': 'Ja',
'option_b': 'Nej',
'option_c': None,
'answer': 'B',
'test_type': 'indfødsretsprøven',
'year': 2020,
'version': 'summer',
'question_id': 1
}
```
### Data Fields
- `question`: a `string` feature.
- `option_a`: a `string` feature.
- `option_b`: a `string` feature.
- `option_c`: a `string` feature.
- `answer`: a `string` feature.
- `test_type`: a `string` feature.
- `year`: an `int64` feature.
- `version`: a `string` feature.
- `question_id`: an `int64` feature.
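For example, a row can be turned into a multiple-choice prompt for evaluating a language model (a minimal sketch; `option_c` may be missing for two-option questions):

```python
from datasets import load_dataset

ds = load_dataset("alexandrainst/danish-citizen-tests", split="train")

def as_prompt(row):
    options = [("A", row["option_a"]), ("B", row["option_b"]), ("C", row["option_c"])]
    lines = [row["question"]] + [f"{label}. {text}" for label, text in options if text]
    return "\n".join(lines)

print(as_prompt(ds[0]))  # the model's choice can then be compared against row["answer"] ("A"/"B"/"C")
```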
## Dataset Creation
### Curation Rationale
There is not a publicly available dataset testing the knowledge about the Danish society.
### Source Data
These tests are all available as PDFs [at this https URL](https://danskogproever.dk/), and extracted using [this Python script](https://gist.github.com/saattrupdan/91c3fd53ceae252dd54439b45736c2e0).
## Additional Information
### Dataset Curators
[Dan Saattrup Nielsen](https://huggingface.co/saattrupdan) from the [The Alexandra
Institute](https://alexandra.dk/)
### Licensing Information
The dataset is licensed under the [CC0
license](https://creativecommons.org/share-your-work/public-domain/cc0/). | alexandrainst/danish-citizen-tests | [
"size_categories:n<1K",
"language:da",
"license:cc0-1.0",
"region:us"
] | 2024-02-15T14:29:48+00:00 | {"language": ["da"], "license": "cc0-1.0", "size_categories": ["n<1K"], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "option_a", "dtype": "string"}, {"name": "option_b", "dtype": "string"}, {"name": "option_c", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "test_type", "dtype": "string"}, {"name": "year", "dtype": "int64"}, {"name": "version", "dtype": "string"}, {"name": "question_id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 125902, "num_examples": 720}], "download_size": 48325, "dataset_size": 125902}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-15T14:55:20+00:00 | [] | [
"da"
] | TAGS
#size_categories-n<1K #language-Danish #license-cc0-1.0 #region-us
|
# Dataset Card for "danish-citizen-tests"
## Dataset Description
- Point of Contact: Dan Saattrup Nielsen
- Size of dataset: 126 KB
- Repository: URL
### Dataset Summary
This dataset contains tests for citizenship ("indfødsretsprøven") and permanent residence ("medborgerskabsprøven") in Denmark, from the years 2016-2023.
### Languages
The dataset is available in Danish ('da').
## Dataset Structure
An example from the dataset looks as follows.
### Data Fields
- 'question': a 'string' feature.
- 'option_a': a 'string' feature.
- 'option_b': a 'string' feature.
- 'option_c': a 'string' feature.
- 'answer': a 'string' feature.
- 'test_type': a 'string' feature.
- 'year': an 'int64' feature.
- 'version': a 'string' feature.
- 'question_id': an 'int64' feature.
## Dataset Creation
### Curation Rationale
There is not a publicly available dataset testing the knowledge about the Danish society.
### Source Data
These tests are all available as PDFs at this https URL, and extracted using this Python script.
## Additional Information
### Dataset Curators
Dan Saattrup Nielsen from the The Alexandra
Institute
### Licensing Information
The dataset is licensed under the CC0
license. | [
"# Dataset Card for \"danish-citizen-tests\"",
"## Dataset Description\n\n- Point of Contact: Dan Saattrup Nielsen\n- Size of dataset: 126 KB\n- Repository: URL",
"### Dataset Summary\n\nThis dataset contains tests for citizenship (\"indfødsretsprøven\") and permanent residence (\"medborgerskabsprøven\") in Denmark, from the years 2016-2023.",
"### Languages\n\nThe dataset is available in Danish ('da').",
"## Dataset Structure\n\nAn example from the dataset looks as follows.",
"### Data Fields\n\n- 'question': a 'string' feature.\n- 'option_a': a 'string' feature.\n- 'option_b': a 'string' feature.\n- 'option_c': a 'string' feature.\n- 'answer': a 'string' feature.\n- 'test_type': a 'string' feature.\n- 'year': an 'int64' feature.\n- 'version': a 'string' feature.\n- 'question_id': an 'int64' feature.",
"## Dataset Creation",
"### Curation Rationale\n\nThere is not a publicly available dataset testing the knowledge about the Danish society.",
"### Source Data\n\nThese tests are all available as PDFs at this https URL, and extracted using this Python script.",
"## Additional Information",
"### Dataset Curators\n\nDan Saattrup Nielsen from the The Alexandra\nInstitute",
"### Licensing Information\n\nThe dataset is licensed under the CC0\nlicense."
] | [
"TAGS\n#size_categories-n<1K #language-Danish #license-cc0-1.0 #region-us \n",
"# Dataset Card for \"danish-citizen-tests\"",
"## Dataset Description\n\n- Point of Contact: Dan Saattrup Nielsen\n- Size of dataset: 126 KB\n- Repository: URL",
"### Dataset Summary\n\nThis dataset contains tests for citizenship (\"indfødsretsprøven\") and permanent residence (\"medborgerskabsprøven\") in Denmark, from the years 2016-2023.",
"### Languages\n\nThe dataset is available in Danish ('da').",
"## Dataset Structure\n\nAn example from the dataset looks as follows.",
"### Data Fields\n\n- 'question': a 'string' feature.\n- 'option_a': a 'string' feature.\n- 'option_b': a 'string' feature.\n- 'option_c': a 'string' feature.\n- 'answer': a 'string' feature.\n- 'test_type': a 'string' feature.\n- 'year': an 'int64' feature.\n- 'version': a 'string' feature.\n- 'question_id': an 'int64' feature.",
"## Dataset Creation",
"### Curation Rationale\n\nThere is not a publicly available dataset testing the knowledge about the Danish society.",
"### Source Data\n\nThese tests are all available as PDFs at this https URL, and extracted using this Python script.",
"## Additional Information",
"### Dataset Curators\n\nDan Saattrup Nielsen from the The Alexandra\nInstitute",
"### Licensing Information\n\nThe dataset is licensed under the CC0\nlicense."
] |
52ab73009ece3482f590a0449e05e78c7540ead9 | # Dataset Card for "QUIZBOT_AI_DATASET"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Sujithanumala/QUIZBOT_AI_DATASET | [
"region:us"
] | 2024-02-15T15:07:52+00:00 | {"dataset_info": {"features": [{"name": "input_ids", "dtype": "string"}, {"name": "labels", "dtype": "string"}], "splits": [{"name": "Train", "num_bytes": 23599038, "num_examples": 23081}, {"name": "Test", "num_bytes": 934809, "num_examples": 1091}], "download_size": 5191350, "dataset_size": 24533847}} | 2024-02-15T15:08:00+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "QUIZBOT_AI_DATASET"
More Information needed | [
"# Dataset Card for \"QUIZBOT_AI_DATASET\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"QUIZBOT_AI_DATASET\"\n\nMore Information needed"
] |
67a52dfab06c8ca45edcbe5e717eeaad276e8622 |
# Dataset Card for Evaluation run of abhishekchohan/SOLAR-10.7B-Instruct-Forest-DPO-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abhishekchohan/SOLAR-10.7B-Instruct-Forest-DPO-v1](https://huggingface.co/abhishekchohan/SOLAR-10.7B-Instruct-Forest-DPO-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhishekchohan__SOLAR-10.7B-Instruct-Forest-DPO-v1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-15T16:19:12.135901](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishekchohan__SOLAR-10.7B-Instruct-Forest-DPO-v1/blob/main/results_2024-02-15T16-19-12.135901.json)(note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6601535530792063,
"acc_stderr": 0.03180088010860255,
"acc_norm": 0.6612267563136458,
"acc_norm_stderr": 0.03244397413229155,
"mc1": 0.6083231334149327,
"mc1_stderr": 0.017087795881769636,
"mc2": 0.7613409869233712,
"mc2_stderr": 0.014044498113643967
},
"harness|arc:challenge|25": {
"acc": 0.6936860068259386,
"acc_stderr": 0.013470584417276511,
"acc_norm": 0.7192832764505119,
"acc_norm_stderr": 0.013131238126975583
},
"harness|hellaswag|10": {
"acc": 0.7001593308105954,
"acc_stderr": 0.004572515919210697,
"acc_norm": 0.8843855805616411,
"acc_norm_stderr": 0.0031910847927931543
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03583496176361073,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03583496176361073
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6297872340425532,
"acc_stderr": 0.03156564682236785,
"acc_norm": 0.6297872340425532,
"acc_norm_stderr": 0.03156564682236785
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4708994708994709,
"acc_stderr": 0.02570765861415496,
"acc_norm": 0.4708994708994709,
"acc_norm_stderr": 0.02570765861415496
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.0351760354036101,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.0351760354036101
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8535353535353535,
"acc_stderr": 0.025190921114603915,
"acc_norm": 0.8535353535353535,
"acc_norm_stderr": 0.025190921114603915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.02407869658063547,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.02407869658063547
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251976,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251976
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461777,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461777
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.033674621388960775,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.033674621388960775
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8354430379746836,
"acc_stderr": 0.024135736240566932,
"acc_norm": 0.8354430379746836,
"acc_norm_stderr": 0.024135736240566932
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572213,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572213
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097654,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097654
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597524,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597524
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7969348659003831,
"acc_stderr": 0.014385525076611573,
"acc_norm": 0.7969348659003831,
"acc_norm_stderr": 0.014385525076611573
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546837,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546837
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4301675977653631,
"acc_stderr": 0.01655860163604104,
"acc_norm": 0.4301675977653631,
"acc_norm_stderr": 0.01655860163604104
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.02240967454730418,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.02240967454730418
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5283687943262412,
"acc_stderr": 0.02977945095730307,
"acc_norm": 0.5283687943262412,
"acc_norm_stderr": 0.02977945095730307
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5013037809647979,
"acc_stderr": 0.012770192691057109,
"acc_norm": 0.5013037809647979,
"acc_norm_stderr": 0.012770192691057109
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.026556519470041513,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.026556519470041513
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6879084967320261,
"acc_stderr": 0.01874501120127766,
"acc_norm": 0.6879084967320261,
"acc_norm_stderr": 0.01874501120127766
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960238,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960238
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5903614457831325,
"acc_stderr": 0.038284011150790206,
"acc_norm": 0.5903614457831325,
"acc_norm_stderr": 0.038284011150790206
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6083231334149327,
"mc1_stderr": 0.017087795881769636,
"mc2": 0.7613409869233712,
"mc2_stderr": 0.014044498113643967
},
"harness|winogrande|5": {
"acc": 0.8216258879242304,
"acc_stderr": 0.010759352014855932
},
"harness|gsm8k|5": {
"acc": 0.645185746777862,
"acc_stderr": 0.013179083387979202
}
}
```
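The per-task scores above can be aggregated into a single figure with a few lines of Python (sketch; the two entries are copied from the JSON above, and in practice you would iterate over the full results dictionary):

```python
# Entries copied from the results JSON above; load the full file for a real aggregation.
results = {
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.7368421052631579},
    "harness|hendrycksTest-virology|5": {"acc_norm": 0.5903614457831325},
}

mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
mmlu_avg = sum(v["acc_norm"] for v in mmlu.values()) / len(mmlu)
print(f"MMLU acc_norm macro-average over {len(mmlu)} subtasks: {mmlu_avg:.4f}")
```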
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_abhishekchohan__SOLAR-10.7B-Instruct-Forest-DPO-v1 | [
"region:us"
] | 2024-02-15T16:21:27+00:00 | {"pretty_name": "Evaluation run of abhishekchohan/SOLAR-10.7B-Instruct-Forest-DPO-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [abhishekchohan/SOLAR-10.7B-Instruct-Forest-DPO-v1](https://huggingface.co/abhishekchohan/SOLAR-10.7B-Instruct-Forest-DPO-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhishekchohan__SOLAR-10.7B-Instruct-Forest-DPO-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T16:19:12.135901](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishekchohan__SOLAR-10.7B-Instruct-Forest-DPO-v1/blob/main/results_2024-02-15T16-19-12.135901.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6601535530792063,\n \"acc_stderr\": 0.03180088010860255,\n \"acc_norm\": 0.6612267563136458,\n \"acc_norm_stderr\": 0.03244397413229155,\n \"mc1\": 0.6083231334149327,\n \"mc1_stderr\": 0.017087795881769636,\n \"mc2\": 0.7613409869233712,\n \"mc2_stderr\": 0.014044498113643967\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6936860068259386,\n \"acc_stderr\": 0.013470584417276511,\n \"acc_norm\": 0.7192832764505119,\n \"acc_norm_stderr\": 0.013131238126975583\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7001593308105954,\n \"acc_stderr\": 0.004572515919210697,\n \"acc_norm\": 0.8843855805616411,\n \"acc_norm_stderr\": 0.0031910847927931543\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361073,\n \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361073\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 
0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6297872340425532,\n \"acc_stderr\": 0.03156564682236785,\n \"acc_norm\": 0.6297872340425532,\n \"acc_norm_stderr\": 0.03156564682236785\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4708994708994709,\n \"acc_stderr\": 0.02570765861415496,\n \"acc_norm\": 0.4708994708994709,\n \"acc_norm_stderr\": 0.02570765861415496\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8535353535353535,\n \"acc_stderr\": 0.025190921114603915,\n \"acc_norm\": 0.8535353535353535,\n \"acc_norm_stderr\": 0.025190921114603915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n 
\"acc_norm_stderr\": 0.02098685459328973\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.02407869658063547,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.02407869658063547\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461777,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461777\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5787037037037037,\n \"acc_stderr\": 0.033674621388960775,\n \"acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.033674621388960775\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8354430379746836,\n \"acc_stderr\": 0.024135736240566932,\n \"acc_norm\": 0.8354430379746836,\n \"acc_norm_stderr\": 0.024135736240566932\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097654,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097654\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597524,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597524\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7969348659003831,\n \"acc_stderr\": 0.014385525076611573,\n \"acc_norm\": 0.7969348659003831,\n \"acc_norm_stderr\": 0.014385525076611573\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546837,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546837\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n \"acc_stderr\": 0.01655860163604104,\n \"acc_norm\": 0.4301675977653631,\n \"acc_norm_stderr\": 0.01655860163604104\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.02240967454730418,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.02240967454730418\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5283687943262412,\n \"acc_stderr\": 0.02977945095730307,\n \"acc_norm\": 0.5283687943262412,\n \"acc_norm_stderr\": 0.02977945095730307\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5013037809647979,\n \"acc_stderr\": 0.012770192691057109,\n \"acc_norm\": 0.5013037809647979,\n \"acc_norm_stderr\": 0.012770192691057109\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.026556519470041513,\n \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.026556519470041513\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6879084967320261,\n \"acc_stderr\": 0.01874501120127766,\n \"acc_norm\": 0.6879084967320261,\n \"acc_norm_stderr\": 0.01874501120127766\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960238,\n \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960238\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5903614457831325,\n \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.5903614457831325,\n \"acc_norm_stderr\": 0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6083231334149327,\n \"mc1_stderr\": 0.017087795881769636,\n \"mc2\": 0.7613409869233712,\n \"mc2_stderr\": 0.014044498113643967\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8216258879242304,\n \"acc_stderr\": 0.010759352014855932\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.645185746777862,\n \"acc_stderr\": 0.013179083387979202\n }\n}\n```", "repo_url": "https://huggingface.co/abhishekchohan/SOLAR-10.7B-Instruct-Forest-DPO-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|arc:challenge|25_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|gsm8k|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hellaswag|10_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T16-19-12.135901.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T16-19-12.135901.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T16-19-12.135901.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T16-19-12.135901.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T16-19-12.135901.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T16_19_12.135901", "path": ["**/details_harness|winogrande|5_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T16-19-12.135901.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_15T16_19_12.135901", "path": ["results_2024-02-15T16-19-12.135901.parquet"]}, {"split": "latest", "path": ["results_2024-02-15T16-19-12.135901.parquet"]}]}]} | 2024-02-15T16:21:51+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of abhishekchohan/SOLAR-10.7B-Instruct-Forest-DPO-v1
Dataset automatically created during the evaluation run of model abhishekchohan/SOLAR-10.7B-Instruct-Forest-DPO-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-15T16:19:12.135901 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of abhishekchohan/SOLAR-10.7B-Instruct-Forest-DPO-v1\n\n\n\nDataset automatically created during the evaluation run of model abhishekchohan/SOLAR-10.7B-Instruct-Forest-DPO-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T16:19:12.135901(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of abhishekchohan/SOLAR-10.7B-Instruct-Forest-DPO-v1\n\n\n\nDataset automatically created during the evaluation run of model abhishekchohan/SOLAR-10.7B-Instruct-Forest-DPO-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T16:19:12.135901(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
37dbd4888642f565a75da7fd15d5bdc321ed220f |
# Dataset Card for Evaluation run of yleo/ParrotOgno-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yleo/ParrotOgno-7B](https://huggingface.co/yleo/ParrotOgno-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yleo__ParrotOgno-7B",
"harness_winogrande_5",
split="train")
```
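
The same call works for any of the other configurations. As a minimal sketch (assuming this repository follows the usual leaderboard layout, with an aggregated `results` configuration and a `latest` split alias in each configuration, as described above), you could also load the aggregated metrics for the most recent run:

```python
from datasets import load_dataset

# "results" is the aggregated configuration; the "latest" split alias points to
# the most recent evaluation run for this model.
results = load_dataset("open-llm-leaderboard/details_yleo__ParrotOgno-7B",
                       "results",
                       split="latest")
print(results[0])  # one row of aggregated metrics for the latest run
```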
## Latest results
These are the [latest results from run 2024-02-15T16:28:55.072793](https://huggingface.co/datasets/open-llm-leaderboard/details_yleo__ParrotOgno-7B/blob/main/results_2024-02-15T16-28-55.072793.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.651472054199089,
"acc_stderr": 0.0320071819287666,
"acc_norm": 0.6506761514645453,
"acc_norm_stderr": 0.03267799309361849,
"mc1": 0.6181150550795593,
"mc1_stderr": 0.017008101939163498,
"mc2": 0.7652952718521188,
"mc2_stderr": 0.013990406463043562
},
"harness|arc:challenge|25": {
"acc": 0.7141638225255973,
"acc_stderr": 0.013203196088537372,
"acc_norm": 0.7303754266211604,
"acc_norm_stderr": 0.012968040686869148
},
"harness|hellaswag|10": {
"acc": 0.714299940250946,
"acc_stderr": 0.004508239594503832,
"acc_norm": 0.8902609042023502,
"acc_norm_stderr": 0.0031192548288489453
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055273,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055273
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590167,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590167
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903347,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903347
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4446927374301676,
"acc_stderr": 0.01661988198817702,
"acc_norm": 0.4446927374301676,
"acc_norm_stderr": 0.01661988198817702
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182652,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182652
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.02575586592263295,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.02575586592263295
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6181150550795593,
"mc1_stderr": 0.017008101939163498,
"mc2": 0.7652952718521188,
"mc2_stderr": 0.013990406463043562
},
"harness|winogrande|5": {
"acc": 0.846093133385951,
"acc_stderr": 0.010141944523750035
},
"harness|gsm8k|5": {
"acc": 0.6959818043972706,
"acc_stderr": 0.012670420440198664
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_yleo__ParrotOgno-7B | [
"region:us"
] | 2024-02-15T16:31:12+00:00 | {"pretty_name": "Evaluation run of yleo/ParrotOgno-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [yleo/ParrotOgno-7B](https://huggingface.co/yleo/ParrotOgno-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yleo__ParrotOgno-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T16:28:55.072793](https://huggingface.co/datasets/open-llm-leaderboard/details_yleo__ParrotOgno-7B/blob/main/results_2024-02-15T16-28-55.072793.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.651472054199089,\n \"acc_stderr\": 0.0320071819287666,\n \"acc_norm\": 0.6506761514645453,\n \"acc_norm_stderr\": 0.03267799309361849,\n \"mc1\": 0.6181150550795593,\n \"mc1_stderr\": 0.017008101939163498,\n \"mc2\": 0.7652952718521188,\n \"mc2_stderr\": 0.013990406463043562\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7141638225255973,\n \"acc_stderr\": 0.013203196088537372,\n \"acc_norm\": 0.7303754266211604,\n \"acc_norm_stderr\": 0.012968040686869148\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.714299940250946,\n \"acc_stderr\": 0.004508239594503832,\n \"acc_norm\": 0.8902609042023502,\n \"acc_norm_stderr\": 0.0031192548288489453\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n 
\"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055273,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055273\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n 
\"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590167,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590167\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n \"acc_stderr\": 0.013586619219903347,\n \"acc_norm\": 0.8250319284802043,\n \"acc_norm_stderr\": 
0.013586619219903347\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4446927374301676,\n \"acc_stderr\": 0.01661988198817702,\n \"acc_norm\": 0.4446927374301676,\n \"acc_norm_stderr\": 0.01661988198817702\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.02575586592263295,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.02575586592263295\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6181150550795593,\n \"mc1_stderr\": 0.017008101939163498,\n \"mc2\": 0.7652952718521188,\n \"mc2_stderr\": 0.013990406463043562\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.846093133385951,\n \"acc_stderr\": 0.010141944523750035\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6959818043972706,\n \"acc_stderr\": 0.012670420440198664\n }\n}\n```", "repo_url": "https://huggingface.co/yleo/ParrotOgno-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|arc:challenge|25_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|gsm8k|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hellaswag|10_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T16-28-55.072793.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T16-28-55.072793.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T16-28-55.072793.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T16-28-55.072793.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T16-28-55.072793.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T16-28-55.072793.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["**/details_harness|winogrande|5_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T16-28-55.072793.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_15T16_28_55.072793", "path": ["results_2024-02-15T16-28-55.072793.parquet"]}, {"split": "latest", "path": 
["results_2024-02-15T16-28-55.072793.parquet"]}]}]} | 2024-02-15T16:31:38+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of yleo/ParrotOgno-7B
Dataset automatically created during the evaluation run of model yleo/ParrotOgno-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
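For example, the snippet below loads the 5-shot Winogrande details of this model with the `datasets` library (the dataset id and configuration name are the ones listed in this card's metadata):

```python
from datasets import load_dataset

# Per-sample details for one evaluation task (5-shot Winogrande) of yleo/ParrotOgno-7B.
data = load_dataset("open-llm-leaderboard/details_yleo__ParrotOgno-7B",
                    "harness_winogrande_5",
                    split="train")
```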
## Latest results
These are the latest results from run 2024-02-15T16:28:55.072793 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
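As a sketch of how to retrieve these numbers programmatically, the aggregated metrics are stored in the "results" configuration, whose "latest" split aliases this run:

```python
from datasets import load_dataset

# Aggregated run-level metrics; "latest" aliases the most recent timestamped split
# (2024-02-15T16:28:55.072793 for this run).
results = load_dataset("open-llm-leaderboard/details_yleo__ParrotOgno-7B",
                       "results",
                       split="latest")
```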
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of yleo/ParrotOgno-7B\n\n\n\nDataset automatically created during the evaluation run of model yleo/ParrotOgno-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T16:28:55.072793(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of yleo/ParrotOgno-7B\n\n\n\nDataset automatically created during the evaluation run of model yleo/ParrotOgno-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T16:28:55.072793(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
b3a6a3290ccb2e40c9c43a28730190ffcccef790 |
# Dataset Card for Evaluation run of RaduGabriel/SirUkrainian
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [RaduGabriel/SirUkrainian](https://huggingface.co/RaduGabriel/SirUkrainian) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RaduGabriel__SirUkrainian",
"harness_winogrande_5",
split="train")
```
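
Beyond a single task split, you can also list the available configurations and load the aggregated "results" configuration. The sketch below is illustrative and not part of the original card; it assumes the config and split names described above (per-task configs, a "results" config, and a "latest" split) and uses standard `datasets` helpers.

```python
# Minimal sketch (assumption: repo layout follows the conventions described above).
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_RaduGabriel__SirUkrainian"

# Each evaluated task has its own configuration; "results" holds the aggregates.
configs = get_dataset_config_names(repo)
print(len(configs), "configurations available")

# Load the aggregated metrics for the most recent run.
results = load_dataset(repo, "results", split="latest")
print(results[0].keys())  # per-task aggregated metrics for the latest run
```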
## Latest results
These are the [latest results from run 2024-02-15T16:52:40.545415](https://huggingface.co/datasets/open-llm-leaderboard/details_RaduGabriel__SirUkrainian/blob/main/results_2024-02-15T16-52-40.545415.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6341358887316704,
"acc_stderr": 0.03253233597406997,
"acc_norm": 0.6358767386408157,
"acc_norm_stderr": 0.03319318434906525,
"mc1": 0.5324357405140759,
"mc1_stderr": 0.017466632149577613,
"mc2": 0.6873616277110236,
"mc2_stderr": 0.014862338695256647
},
"harness|arc:challenge|25": {
"acc": 0.6348122866894198,
"acc_stderr": 0.014070265519268802,
"acc_norm": 0.6732081911262798,
"acc_norm_stderr": 0.013706665975587328
},
"harness|hellaswag|10": {
"acc": 0.676956781517626,
"acc_stderr": 0.004666833452796188,
"acc_norm": 0.8554072893845848,
"acc_norm_stderr": 0.0035097096477918377
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119668,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119668
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249386,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249386
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266344,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266344
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997695,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997695
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031086,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031086
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5467980295566502,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.5467980295566502,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121427,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121427
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857413,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857413
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.016465345467391514,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.016465345467391514
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294406999,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294406999
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48268156424581005,
"acc_stderr": 0.01671246744170252,
"acc_norm": 0.48268156424581005,
"acc_norm_stderr": 0.01671246744170252
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279056,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279056
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.02666441088693762,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.02666441088693762
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.02577311116963046,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.02577311116963046
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4641460234680574,
"acc_stderr": 0.012737361318730581,
"acc_norm": 0.4641460234680574,
"acc_norm_stderr": 0.012737361318730581
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.029097209568411955,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.029097209568411955
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.01920660684882536,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.01920660684882536
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484375,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484375
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5324357405140759,
"mc1_stderr": 0.017466632149577613,
"mc2": 0.6873616277110236,
"mc2_stderr": 0.014862338695256647
},
"harness|winogrande|5": {
"acc": 0.8153117600631413,
"acc_stderr": 0.010905978112156878
},
"harness|gsm8k|5": {
"acc": 0.5670962850644428,
"acc_stderr": 0.013647916362576045
}
}
```
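
As a rough illustration of how the headline numbers above combine, the snippet below computes a plain mean over the benchmark scores reported in this run. The values are copied from the JSON above; the overall "all" accuracy is used here only as a proxy for the MMLU average, and the simple mean is an approximation rather than the leaderboard's own aggregation.

```python
# Illustrative only: a plain average over headline scores taken from the JSON above.
# The Open LLM Leaderboard applies its own aggregation; this is just an approximation.
scores = {
    "ARC (25-shot, acc_norm)": 0.6732081911262798,
    "HellaSwag (10-shot, acc_norm)": 0.8554072893845848,
    "MMLU proxy (overall 'all' acc)": 0.6341358887316704,  # assumption: used as a stand-in
    "TruthfulQA (0-shot, mc2)": 0.6873616277110236,
    "Winogrande (5-shot, acc)": 0.8153117600631413,
    "GSM8K (5-shot, acc)": 0.5670962850644428,
}

average = sum(scores.values()) / len(scores)
print(f"Simple mean across the six benchmarks: {average:.4f}")
```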
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_RaduGabriel__SirUkrainian | [
"region:us"
] | 2024-02-15T16:55:01+00:00 | {"pretty_name": "Evaluation run of RaduGabriel/SirUkrainian", "dataset_summary": "Dataset automatically created during the evaluation run of model [RaduGabriel/SirUkrainian](https://huggingface.co/RaduGabriel/SirUkrainian) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RaduGabriel__SirUkrainian\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T16:52:40.545415](https://huggingface.co/datasets/open-llm-leaderboard/details_RaduGabriel__SirUkrainian/blob/main/results_2024-02-15T16-52-40.545415.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6341358887316704,\n \"acc_stderr\": 0.03253233597406997,\n \"acc_norm\": 0.6358767386408157,\n \"acc_norm_stderr\": 0.03319318434906525,\n \"mc1\": 0.5324357405140759,\n \"mc1_stderr\": 0.017466632149577613,\n \"mc2\": 0.6873616277110236,\n \"mc2_stderr\": 0.014862338695256647\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6348122866894198,\n \"acc_stderr\": 0.014070265519268802,\n \"acc_norm\": 0.6732081911262798,\n \"acc_norm_stderr\": 0.013706665975587328\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.676956781517626,\n \"acc_stderr\": 0.004666833452796188,\n \"acc_norm\": 0.8554072893845848,\n \"acc_norm_stderr\": 0.0035097096477918377\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119668,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119668\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249386,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249386\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n 
\"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266344,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266344\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997695,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997695\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n \"acc_stderr\": 0.024362599693031086,\n \"acc_norm\": 0.7580645161290323,\n \"acc_norm_stderr\": 0.024362599693031086\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5467980295566502,\n \"acc_stderr\": 0.03502544650845872,\n \"acc_norm\": 0.5467980295566502,\n \"acc_norm_stderr\": 0.03502544650845872\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121427,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121427\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6487179487179487,\n 
\"acc_stderr\": 0.024203665177902803,\n \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857413,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857413\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8201834862385321,\n \"acc_stderr\": 0.016465345467391514,\n \"acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.016465345467391514\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n \"acc_stderr\": 0.014000791294406999,\n \"acc_norm\": 0.8109833971902938,\n 
\"acc_norm_stderr\": 0.014000791294406999\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48268156424581005,\n \"acc_stderr\": 0.01671246744170252,\n \"acc_norm\": 0.48268156424581005,\n \"acc_norm_stderr\": 0.01671246744170252\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279056,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279056\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n \"acc_stderr\": 0.02666441088693762,\n \"acc_norm\": 0.6720257234726688,\n \"acc_norm_stderr\": 0.02666441088693762\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.02577311116963046,\n \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.02577311116963046\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4641460234680574,\n \"acc_stderr\": 0.012737361318730581,\n \"acc_norm\": 0.4641460234680574,\n \"acc_norm_stderr\": 0.012737361318730581\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.029097209568411955,\n \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.029097209568411955\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.01920660684882536,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.01920660684882536\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484375,\n \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484375\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5324357405140759,\n \"mc1_stderr\": 0.017466632149577613,\n \"mc2\": 0.6873616277110236,\n \"mc2_stderr\": 0.014862338695256647\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8153117600631413,\n \"acc_stderr\": 0.010905978112156878\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5670962850644428,\n \"acc_stderr\": 0.013647916362576045\n }\n}\n```", "repo_url": "https://huggingface.co/RaduGabriel/SirUkrainian", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|arc:challenge|25_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|gsm8k|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hellaswag|10_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T16-52-40.545415.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T16-52-40.545415.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T16-52-40.545415.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T16-52-40.545415.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T16-52-40.545415.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T16-52-40.545415.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["**/details_harness|winogrande|5_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T16-52-40.545415.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_15T16_52_40.545415", "path": ["results_2024-02-15T16-52-40.545415.parquet"]}, {"split": "latest", "path": 
["results_2024-02-15T16-52-40.545415.parquet"]}]}]} | 2024-02-15T16:55:27+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of RaduGabriel/SirUkrainian
Dataset automatically created during the evaluation run of model RaduGabriel/SirUkrainian on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-15T16:52:40.545415 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of RaduGabriel/SirUkrainian\n\n\n\nDataset automatically created during the evaluation run of model RaduGabriel/SirUkrainian on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T16:52:40.545415(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of RaduGabriel/SirUkrainian\n\n\n\nDataset automatically created during the evaluation run of model RaduGabriel/SirUkrainian on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T16:52:40.545415(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
f44e90cf1664ac2573d8a1196931215be5d719cb |
# Dataset Card for Evaluation run of uukuguy/speechless-thoughts-mistral-7b-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [uukuguy/speechless-thoughts-mistral-7b-v1.0](https://huggingface.co/uukuguy/speechless-thoughts-mistral-7b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-thoughts-mistral-7b-v1.0",
"harness_winogrande_5",
split="train")
```
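
To work with just the aggregated metrics instead of the per-task details, the "results" configuration listed in this card can be loaded the same way. A minimal sketch, assuming the config and split names exactly as they appear in this card's configuration list:

```python
from datasets import load_dataset

# "results" holds the aggregated scores; the "latest" split always points
# to the most recent evaluation run (names taken from this card's configs).
results = load_dataset(
    "open-llm-leaderboard/details_uukuguy__speechless-thoughts-mistral-7b-v1.0",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated metrics for the run
```

Per-task configurations (for example "harness_gsm8k_5" or any of the hendrycksTest subjects) follow the same pattern, swapping in the config name from this dataset's configuration list.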
## Latest results
These are the [latest results from run 2024-02-15T17:05:22.470377](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-thoughts-mistral-7b-v1.0/blob/main/results_2024-02-15T17-05-22.470377.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5514023677675722,
"acc_stderr": 0.03404989707789746,
"acc_norm": 0.5546017538931538,
"acc_norm_stderr": 0.034766375621007026,
"mc1": 0.33047735618115054,
"mc1_stderr": 0.0164667696136983,
"mc2": 0.4808579181201282,
"mc2_stderr": 0.014886882686201535
},
"harness|arc:challenge|25": {
"acc": 0.5622866894197952,
"acc_stderr": 0.014497573881108288,
"acc_norm": 0.5853242320819113,
"acc_norm_stderr": 0.014397070564409174
},
"harness|hellaswag|10": {
"acc": 0.6164110734913364,
"acc_stderr": 0.004852658876775388,
"acc_norm": 0.8124875522804222,
"acc_norm_stderr": 0.0038952463204527674
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.043192236258113324,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.043192236258113324
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5773584905660377,
"acc_stderr": 0.030402331445769537,
"acc_norm": 0.5773584905660377,
"acc_norm_stderr": 0.030402331445769537
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.03794012674697031,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.03794012674697031
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663434,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663434
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.02422996529842509,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.02422996529842509
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.02727389059430064,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.02727389059430064
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03756335775187897,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03756335775187897
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7668393782383419,
"acc_stderr": 0.03051611137147601,
"acc_norm": 0.7668393782383419,
"acc_norm_stderr": 0.03051611137147601
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5435897435897435,
"acc_stderr": 0.025254485424799605,
"acc_norm": 0.5435897435897435,
"acc_norm_stderr": 0.025254485424799605
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.592436974789916,
"acc_stderr": 0.031918633744784645,
"acc_norm": 0.592436974789916,
"acc_norm_stderr": 0.031918633744784645
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7724770642201835,
"acc_stderr": 0.017974463578776502,
"acc_norm": 0.7724770642201835,
"acc_norm_stderr": 0.017974463578776502
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854052,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.03166009679399812,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.03166009679399812
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6244725738396625,
"acc_stderr": 0.03152256243091156,
"acc_norm": 0.6244725738396625,
"acc_norm_stderr": 0.03152256243091156
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5336322869955157,
"acc_stderr": 0.033481800170603065,
"acc_norm": 0.5336322869955157,
"acc_norm_stderr": 0.033481800170603065
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884124,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884124
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.046166311118017125,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.046166311118017125
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833587,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833587
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7478632478632479,
"acc_stderr": 0.02844796547623102,
"acc_norm": 0.7478632478632479,
"acc_norm_stderr": 0.02844796547623102
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7126436781609196,
"acc_stderr": 0.0161824107306827,
"acc_norm": 0.7126436781609196,
"acc_norm_stderr": 0.0161824107306827
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.026483392042098177,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.026483392042098177
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2782122905027933,
"acc_stderr": 0.014987325439963534,
"acc_norm": 0.2782122905027933,
"acc_norm_stderr": 0.014987325439963534
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.027420477662629242,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.027420477662629242
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5787781350482315,
"acc_stderr": 0.028043399858210628,
"acc_norm": 0.5787781350482315,
"acc_norm_stderr": 0.028043399858210628
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6234567901234568,
"acc_stderr": 0.026959344518747787,
"acc_norm": 0.6234567901234568,
"acc_norm_stderr": 0.026959344518747787
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.37943262411347517,
"acc_stderr": 0.028947338851614105,
"acc_norm": 0.37943262411347517,
"acc_norm_stderr": 0.028947338851614105
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34224250325945244,
"acc_stderr": 0.012117939998705855,
"acc_norm": 0.34224250325945244,
"acc_norm_stderr": 0.012117939998705855
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5955882352941176,
"acc_stderr": 0.02981263070156974,
"acc_norm": 0.5955882352941176,
"acc_norm_stderr": 0.02981263070156974
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.020219083895133924,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.020219083895133924
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6163265306122448,
"acc_stderr": 0.031130880396235936,
"acc_norm": 0.6163265306122448,
"acc_norm_stderr": 0.031130880396235936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33047735618115054,
"mc1_stderr": 0.0164667696136983,
"mc2": 0.4808579181201282,
"mc2_stderr": 0.014886882686201535
},
"harness|winogrande|5": {
"acc": 0.7813733228097869,
"acc_stderr": 0.011616198215773229
},
"harness|gsm8k|5": {
"acc": 0.35178165276724793,
"acc_stderr": 0.013153446023536028
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_uukuguy__speechless-thoughts-mistral-7b-v1.0 | [
"region:us"
] | 2024-02-15T17:07:40+00:00 | {"pretty_name": "Evaluation run of uukuguy/speechless-thoughts-mistral-7b-v1.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [uukuguy/speechless-thoughts-mistral-7b-v1.0](https://huggingface.co/uukuguy/speechless-thoughts-mistral-7b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-thoughts-mistral-7b-v1.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T17:05:22.470377](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-thoughts-mistral-7b-v1.0/blob/main/results_2024-02-15T17-05-22.470377.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5514023677675722,\n \"acc_stderr\": 0.03404989707789746,\n \"acc_norm\": 0.5546017538931538,\n \"acc_norm_stderr\": 0.034766375621007026,\n \"mc1\": 0.33047735618115054,\n \"mc1_stderr\": 0.0164667696136983,\n \"mc2\": 0.4808579181201282,\n \"mc2_stderr\": 0.014886882686201535\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5622866894197952,\n \"acc_stderr\": 0.014497573881108288,\n \"acc_norm\": 0.5853242320819113,\n \"acc_norm_stderr\": 0.014397070564409174\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6164110734913364,\n \"acc_stderr\": 0.004852658876775388,\n \"acc_norm\": 0.8124875522804222,\n \"acc_norm_stderr\": 0.0038952463204527674\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n \"acc_stderr\": 0.043192236258113324,\n \"acc_norm\": 0.5037037037037037,\n \"acc_norm_stderr\": 0.043192236258113324\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.030402331445769537,\n \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.030402331445769537\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n \"acc_norm_stderr\": 0.04089465449325582\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n \"acc_stderr\": 0.03794012674697031,\n \"acc_norm\": 0.5491329479768786,\n \"acc_norm_stderr\": 0.03794012674697031\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663434,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663434\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3306878306878307,\n \"acc_stderr\": 0.02422996529842509,\n \"acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.02422996529842509\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n \"acc_stderr\": 0.02727389059430064,\n \"acc_norm\": 0.6419354838709678,\n \"acc_norm_stderr\": 0.02727389059430064\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03756335775187897,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03756335775187897\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7668393782383419,\n \"acc_stderr\": 0.03051611137147601,\n \"acc_norm\": 0.7668393782383419,\n 
\"acc_norm_stderr\": 0.03051611137147601\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5435897435897435,\n \"acc_stderr\": 0.025254485424799605,\n \"acc_norm\": 0.5435897435897435,\n \"acc_norm_stderr\": 0.025254485424799605\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.592436974789916,\n \"acc_stderr\": 0.031918633744784645,\n \"acc_norm\": 0.592436974789916,\n \"acc_norm_stderr\": 0.031918633744784645\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7724770642201835,\n \"acc_stderr\": 0.017974463578776502,\n \"acc_norm\": 0.7724770642201835,\n \"acc_norm_stderr\": 0.017974463578776502\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.03166009679399812,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.03166009679399812\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6244725738396625,\n \"acc_stderr\": 0.03152256243091156,\n \"acc_norm\": 0.6244725738396625,\n \"acc_norm_stderr\": 0.03152256243091156\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5336322869955157,\n \"acc_stderr\": 0.033481800170603065,\n \"acc_norm\": 0.5336322869955157,\n \"acc_norm_stderr\": 0.033481800170603065\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884124,\n \"acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884124\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.046166311118017125,\n \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.046166311118017125\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n \"acc_stderr\": 0.04521829902833587,\n \"acc_norm\": 0.3482142857142857,\n \"acc_norm_stderr\": 0.04521829902833587\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7478632478632479,\n \"acc_stderr\": 0.02844796547623102,\n \"acc_norm\": 0.7478632478632479,\n \"acc_norm_stderr\": 0.02844796547623102\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7126436781609196,\n \"acc_stderr\": 0.0161824107306827,\n \"acc_norm\": 0.7126436781609196,\n \"acc_norm_stderr\": 0.0161824107306827\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.026483392042098177,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.026483392042098177\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2782122905027933,\n \"acc_stderr\": 0.014987325439963534,\n \"acc_norm\": 0.2782122905027933,\n \"acc_norm_stderr\": 0.014987325439963534\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6437908496732027,\n \"acc_stderr\": 0.027420477662629242,\n \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.027420477662629242\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5787781350482315,\n \"acc_stderr\": 0.028043399858210628,\n \"acc_norm\": 0.5787781350482315,\n \"acc_norm_stderr\": 0.028043399858210628\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6234567901234568,\n \"acc_stderr\": 0.026959344518747787,\n \"acc_norm\": 0.6234567901234568,\n \"acc_norm_stderr\": 0.026959344518747787\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614105,\n \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614105\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34224250325945244,\n \"acc_stderr\": 0.012117939998705855,\n \"acc_norm\": 0.34224250325945244,\n \"acc_norm_stderr\": 0.012117939998705855\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5955882352941176,\n \"acc_stderr\": 0.02981263070156974,\n \"acc_norm\": 0.5955882352941176,\n \"acc_norm_stderr\": 0.02981263070156974\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.020219083895133924,\n \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.020219083895133924\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6163265306122448,\n \"acc_stderr\": 0.031130880396235936,\n \"acc_norm\": 0.6163265306122448,\n \"acc_norm_stderr\": 0.031130880396235936\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.463855421686747,\n \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.0312678171466318,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.0312678171466318\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33047735618115054,\n \"mc1_stderr\": 0.0164667696136983,\n \"mc2\": 0.4808579181201282,\n \"mc2_stderr\": 0.014886882686201535\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.011616198215773229\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.35178165276724793,\n \"acc_stderr\": 0.013153446023536028\n }\n}\n```", "repo_url": "https://huggingface.co/uukuguy/speechless-thoughts-mistral-7b-v1.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|arc:challenge|25_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|gsm8k|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hellaswag|10_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T17-05-22.470377.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T17-05-22.470377.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T17-05-22.470377.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T17-05-22.470377.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T17-05-22.470377.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T17_05_22.470377", "path": ["**/details_harness|winogrande|5_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T17-05-22.470377.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_15T17_05_22.470377", "path": ["results_2024-02-15T17-05-22.470377.parquet"]}, {"split": "latest", "path": ["results_2024-02-15T17-05-22.470377.parquet"]}]}]} | 2024-02-15T17:08:07+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of uukuguy/speechless-thoughts-mistral-7b-v1.0
Dataset automatically created during the evaluation run of model uukuguy/speechless-thoughts-mistral-7b-v1.0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
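A minimal sketch (the repository id below is assumed to follow the leaderboard's usual `details_<org>__<model>` naming convention; the `harness_winogrande_5` configuration is listed in this dataset's configs):

```python
from datasets import load_dataset

# Repository id assumed from the leaderboard's details_<org>__<model> naming convention
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-thoughts-mistral-7b-v1.0",
	"harness_winogrande_5",
	split="train")
```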
## Latest results
These are the latest results from run 2024-02-15T17:05:22.470377 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of uukuguy/speechless-thoughts-mistral-7b-v1.0\n\n\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-thoughts-mistral-7b-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T17:05:22.470377(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of uukuguy/speechless-thoughts-mistral-7b-v1.0\n\n\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-thoughts-mistral-7b-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T17:05:22.470377(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
93f020b3dc6def5325ec2cbee6de903e38ad74de | # Title Context Pairs
All notebooks at https://github.com/mesolitica/malaysian-dataset/tree/master/embedding/title-context-pair | mesolitica/title-context-pair | [
"language:ms",
"language:en",
"region:us"
] | 2024-02-15T17:27:21+00:00 | {"language": ["ms", "en"]} | 2024-02-17T16:07:14+00:00 | [] | [
"ms",
"en"
] | TAGS
#language-Malay (macrolanguage) #language-English #region-us
| # Title Context Pairs
All notebooks at URL | [
"# Title Context Pairs\n\nAll notebooks at URL"
] | [
"TAGS\n#language-Malay (macrolanguage) #language-English #region-us \n",
"# Title Context Pairs\n\nAll notebooks at URL"
] |
175b303c4ea339f7bbc8649109ae1aa5251c611e | # Dataset Card for "train_ds_noise"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | adityarra07/train_ds_noise | [
"region:us"
] | 2024-02-15T17:57:57+00:00 | {"dataset_info": {"features": [{"name": "audio", "struct": [{"name": "array", "sequence": "float32"}, {"name": "path", "dtype": "null"}, {"name": "sampling_rate", "dtype": "int64"}]}, {"name": "transcription", "dtype": "string"}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5052608063.049213, "num_examples": 22152}, {"name": "test", "num_bytes": 114044060.65026213, "num_examples": 500}], "download_size": 5191539498, "dataset_size": 5166652123.699475}} | 2024-02-15T18:00:49+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "train_ds_noise"
More Information needed | [
"# Dataset Card for \"train_ds_noise\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"train_ds_noise\"\n\nMore Information needed"
] |
08d28a151e2c3fe6dbe4a589aa518dde83105b3b | # Dataset Card for "test_ds_noise"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | adityarra07/test_ds_noise | [
"region:us"
] | 2024-02-15T18:00:49+00:00 | {"dataset_info": {"features": [{"name": "audio", "struct": [{"name": "array", "sequence": "float32"}, {"name": "path", "dtype": "null"}, {"name": "sampling_rate", "dtype": "int64"}]}, {"name": "transcription", "dtype": "string"}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 228088121.30052426, "num_examples": 1000}], "download_size": 224454975, "dataset_size": 228088121.30052426}} | 2024-02-15T18:00:58+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "test_ds_noise"
More Information needed | [
"# Dataset Card for \"test_ds_noise\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"test_ds_noise\"\n\nMore Information needed"
] |
58422b850435e4c466fa2481893aa591e85434b6 | # The dataset of the most popular text-to-image prompts.
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** kazimir.ai
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** https://kazimir.ai
- **License:** apache-2.0
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
Free to use.
## Dataset Structure
A CSV file with two columns: *name* and *count*.
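A minimal loading sketch (assumptions: the CSV sits in the `Kazimir-ai/text-to-image-prompts` repository and is auto-detected by the `datasets` library):

```python
from datasets import load_dataset

# Load the prompt CSV from the Hugging Face Hub (CSV data files are auto-detected)
prompts = load_dataset("Kazimir-ai/text-to-image-prompts", split="train")

# Each row has a prompt string (*name*) and its *count*
print(prompts[0])  # a dict with 'name' and 'count' keys
```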
### Source Data
The prompts from kazimir.ai.
## Dataset Card Contact
[email protected] | Kazimir-ai/text-to-image-prompts | [
"size_categories:1K<n<10K",
"language:en",
"license:apache-2.0",
"prompts",
"text-to-image",
"stable diffusion",
"region:us"
] | 2024-02-15T18:34:13+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "pretty_name": "The dataset of the most popular text-to-image prompts", "tags": ["prompts", "text-to-image", "stable diffusion"]} | 2024-02-15T18:42:42+00:00 | [] | [
"en"
] | TAGS
#size_categories-1K<n<10K #language-English #license-apache-2.0 #prompts #text-to-image #stable diffusion #region-us
| # The dataset of the most popular text-to-image prompts.
## Dataset Details
### Dataset Description
- Curated by: URL
- Funded by [optional]:
- Shared by [optional]: URL
- License: apache-2.0
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
Free to use.
## Dataset Structure
CSV file columns *name* and *count*.
### Source Data
The prompts from URL.
## Dataset Card Contact
data@URL | [
"# The dataset of the most popular text-to-image prompts.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: URL\n- Funded by [optional]: \n- Shared by [optional]: URL\n- License: apache-2.0",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses\n\nFree to use.",
"## Dataset Structure\n\nCSV file columns *name* and *count*.",
"### Source Data\n\nThe prompts from URL.",
"## Dataset Card Contact\n\ndata@URL"
] | [
"TAGS\n#size_categories-1K<n<10K #language-English #license-apache-2.0 #prompts #text-to-image #stable diffusion #region-us \n",
"# The dataset of the most popular text-to-image prompts.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: URL\n- Funded by [optional]: \n- Shared by [optional]: URL\n- License: apache-2.0",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses\n\nFree to use.",
"## Dataset Structure\n\nCSV file columns *name* and *count*.",
"### Source Data\n\nThe prompts from URL.",
"## Dataset Card Contact\n\ndata@URL"
] |
c9f8cfe147d7bde5bb3664f96471a56659b4222a |
# Dataset Card for Evaluation run of CorticalStack/mistral-7b-alpaca-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CorticalStack/mistral-7b-alpaca-sft](https://huggingface.co/CorticalStack/mistral-7b-alpaca-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CorticalStack__mistral-7b-alpaca-sft",
"harness_winogrande_5",
split="train")
```
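The aggregated metrics can be read from the "results" configuration in the same way (a sketch; the "latest" split points to the most recent run):

```python
from datasets import load_dataset

# Aggregated run-level metrics live in the "results" configuration;
# "latest" always points to the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_CorticalStack__mistral-7b-alpaca-sft",
	"results",
	split="latest")
```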
## Latest results
These are the [latest results from run 2024-02-15T19:16:07.365309](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__mistral-7b-alpaca-sft/blob/main/results_2024-02-15T19-16-07.365309.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6146116714025754,
"acc_stderr": 0.03286709487834751,
"acc_norm": 0.6202418665578125,
"acc_norm_stderr": 0.03353487447545581,
"mc1": 0.3769889840881273,
"mc1_stderr": 0.016965517578930354,
"mc2": 0.5359107243003883,
"mc2_stderr": 0.014857832315965628
},
"harness|arc:challenge|25": {
"acc": 0.5819112627986348,
"acc_stderr": 0.01441398839699608,
"acc_norm": 0.6168941979522184,
"acc_norm_stderr": 0.014206472661672876
},
"harness|hellaswag|10": {
"acc": 0.634833698466441,
"acc_stderr": 0.004804927608773126,
"acc_norm": 0.8355905198167696,
"acc_norm_stderr": 0.0036988923883801024
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424648,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424648
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.02518900666021238,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.02518900666021238
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.035014387062967806,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.035014387062967806
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945627,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306433,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.024603626924097417,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.024603626924097417
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.031753678460966245,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.031753678460966245
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.0399552400768168,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.0399552400768168
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7981651376146789,
"acc_stderr": 0.01720857935778758,
"acc_norm": 0.7981651376146789,
"acc_norm_stderr": 0.01720857935778758
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281382,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281382
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424384,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424384
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.02507071371915319,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.02507071371915319
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25921787709497207,
"acc_stderr": 0.014655780837497733,
"acc_norm": 0.25921787709497207,
"acc_norm_stderr": 0.014655780837497733
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.02645722506781103,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.02645722506781103
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4335071707953064,
"acc_stderr": 0.012656810383983967,
"acc_norm": 0.4335071707953064,
"acc_norm_stderr": 0.012656810383983967
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254187,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254187
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.019627444748412236,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.019627444748412236
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.02950489645459596,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.02950489645459596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482708,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482708
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3769889840881273,
"mc1_stderr": 0.016965517578930354,
"mc2": 0.5359107243003883,
"mc2_stderr": 0.014857832315965628
},
"harness|winogrande|5": {
"acc": 0.7719021310181531,
"acc_stderr": 0.011793015817663595
},
"harness|gsm8k|5": {
"acc": 0.36087945413191813,
"acc_stderr": 0.013228626753925145
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_CorticalStack__mistral-7b-alpaca-sft | [
"region:us"
] | 2024-02-15T19:18:24+00:00 | {"pretty_name": "Evaluation run of CorticalStack/mistral-7b-alpaca-sft", "dataset_summary": "Dataset automatically created during the evaluation run of model [CorticalStack/mistral-7b-alpaca-sft](https://huggingface.co/CorticalStack/mistral-7b-alpaca-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CorticalStack__mistral-7b-alpaca-sft\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-15T19:16:07.365309](https://huggingface.co/datasets/open-llm-leaderboard/details_CorticalStack__mistral-7b-alpaca-sft/blob/main/results_2024-02-15T19-16-07.365309.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6146116714025754,\n \"acc_stderr\": 0.03286709487834751,\n \"acc_norm\": 0.6202418665578125,\n \"acc_norm_stderr\": 0.03353487447545581,\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.016965517578930354,\n \"mc2\": 0.5359107243003883,\n \"mc2_stderr\": 0.014857832315965628\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5819112627986348,\n \"acc_stderr\": 0.01441398839699608,\n \"acc_norm\": 0.6168941979522184,\n \"acc_norm_stderr\": 0.014206472661672876\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.634833698466441,\n \"acc_stderr\": 0.004804927608773126,\n \"acc_norm\": 0.8355905198167696,\n \"acc_norm_stderr\": 0.0036988923883801024\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 
0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424648,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424648\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7322580645161291,\n \"acc_stderr\": 0.02518900666021238,\n \"acc_norm\": 0.7322580645161291,\n \"acc_norm_stderr\": 0.02518900666021238\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.035014387062967806,\n \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.035014387062967806\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945627,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945627\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.024603626924097417,\n \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.024603626924097417\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.031753678460966245,\n \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.031753678460966245\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3973509933774834,\n \"acc_stderr\": 0.0399552400768168,\n \"acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.0399552400768168\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7981651376146789,\n \"acc_stderr\": 0.01720857935778758,\n \"acc_norm\": 0.7981651376146789,\n \"acc_norm_stderr\": 0.01720857935778758\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281382,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281382\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n \"acc_stderr\": 
0.014179171373424384,\n \"acc_norm\": 0.8045977011494253,\n \"acc_norm_stderr\": 0.014179171373424384\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.02507071371915319,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.02507071371915319\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25921787709497207,\n \"acc_stderr\": 0.014655780837497733,\n \"acc_norm\": 0.25921787709497207,\n \"acc_norm_stderr\": 0.014655780837497733\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.02645722506781103,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.02645722506781103\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4335071707953064,\n \"acc_stderr\": 0.012656810383983967,\n \"acc_norm\": 0.4335071707953064,\n \"acc_norm_stderr\": 0.012656810383983967\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254187,\n \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254187\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.019627444748412236,\n \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.019627444748412236\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.02950489645459596,\n \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.02950489645459596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482708,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482708\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.016965517578930354,\n \"mc2\": 0.5359107243003883,\n \"mc2_stderr\": 0.014857832315965628\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.011793015817663595\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.36087945413191813,\n \"acc_stderr\": 0.013228626753925145\n }\n}\n```", "repo_url": 
"https://huggingface.co/CorticalStack/mistral-7b-alpaca-sft", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|arc:challenge|25_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|gsm8k|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hellaswag|10_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T19-16-07.365309.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T19-16-07.365309.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-15T19-16-07.365309.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-15T19-16-07.365309.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T19-16-07.365309.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T19-16-07.365309.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["**/details_harness|winogrande|5_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-15T19-16-07.365309.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_15T19_16_07.365309", "path": ["results_2024-02-15T19-16-07.365309.parquet"]}, {"split": "latest", "path": 
["results_2024-02-15T19-16-07.365309.parquet"]}]}]} | 2024-02-15T19:19:05+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of CorticalStack/mistral-7b-alpaca-sft
Dataset automatically created during the evaluation run of model CorticalStack/mistral-7b-alpaca-sft on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
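A minimal loading sketch is given below; the repository id follows the leaderboard's usual `details_<org>__<model>` naming convention and the config name is one listed in this card's metadata, so treat both as assumptions if the actual layout differs.

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard's usual "details_<org>__<model>" convention;
# "harness_winogrande_5" is one of the configs listed in this card's metadata.
data = load_dataset("open-llm-leaderboard/details_CorticalStack__mistral-7b-alpaca-sft",
    "harness_winogrande_5",
    split="train")
```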
## Latest results
These are the latest results from run 2024-02-15T19:16:07.365309 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of CorticalStack/mistral-7b-alpaca-sft\n\n\n\nDataset automatically created during the evaluation run of model CorticalStack/mistral-7b-alpaca-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T19:16:07.365309(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of CorticalStack/mistral-7b-alpaca-sft\n\n\n\nDataset automatically created during the evaluation run of model CorticalStack/mistral-7b-alpaca-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-15T19:16:07.365309(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
4e934082931d9c62a233e000b7098aa71c4e214b | # Dataset Card for "chess_world_lichess_elite"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | austindavis/chess_world_lichess_elite | [
"region:us"
] | 2024-02-15T19:27:20+00:00 | {"dataset_info": {"features": [{"name": "Event", "dtype": "string"}, {"name": "Site", "dtype": "string"}, {"name": "Date", "dtype": "string"}, {"name": "Round", "dtype": "string"}, {"name": "White", "dtype": "string"}, {"name": "Black", "dtype": "string"}, {"name": "Result", "dtype": "string"}, {"name": "ECO", "dtype": "string"}, {"name": "WhiteElo", "dtype": "int64"}, {"name": "BlackElo", "dtype": "int64"}, {"name": "PlyCount", "dtype": "int64"}, {"name": "EventDate", "dtype": "string"}, {"name": "EventType", "dtype": "string"}, {"name": "transcript", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 157006085, "num_examples": 234048}], "download_size": 78928248, "dataset_size": 157006085}} | 2024-02-15T20:06:14+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "chess_world_lichess_elite"
More Information needed | [
"# Dataset Card for \"chess_world_lichess_elite\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"chess_world_lichess_elite\"\n\nMore Information needed"
] |
6c3562e0bce1c854581300abe98c272508859622 | # Dataset Card for Dataset Name
UA-CBT is a dataset inspired by the Children's Book Test (https://arxiv.org/abs/1511.02301). It contains machine-generated (and human-corrected) stories with gaps, together with multiple candidate words for filling each gap.

The text is in the Ukrainian language, and the candidate options have been inflected to grammatically match the original words. Gaps are of three types: named entities, common nouns, and verbs.
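A minimal loading sketch is shown below; since the configuration, split, and column names are not documented in this card, the code only loads the dataset and prints one raw record so the actual fields can be inspected.

```python
from datasets import load_dataset

# Load the dataset from the Hub and inspect its splits and one raw record.
ds = load_dataset("shamotskyi/ua_cbt")
print(ds)

# Field names are not documented in this card, so just print whatever the first record contains.
first_split = list(ds.keys())[0]
print(ds[first_split][0])
```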
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
- Serhii Hamotskyi
- **TODO**
- Mariia Tkachenko
- Daria Kravets
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | shamotskyi/ua_cbt | [
"language:uk",
"license:cc-by-nc-4.0",
"arxiv:1511.02301",
"region:us"
] | 2024-02-15T19:30:34+00:00 | {"language": ["uk"], "license": "cc-by-nc-4.0"} | 2024-02-15T19:50:59+00:00 | [
"1511.02301"
] | [
"uk"
] | TAGS
#language-Ukrainian #license-cc-by-nc-4.0 #arxiv-1511.02301 #region-us
| # Dataset Card for Dataset Name
UA-CBT is a dataset inspired by Children's Book Test (URL containing machine-generated (and human-corrected) stories with gaps, and multiple possible options for words to fill the gaps.
The text is in Ukrainian language, and the options have been inflected to grammatically match the original ones. Gaps are of three types: named entities, common nouns, and verbs.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
- Serhii Hamotskyi
- TODO
- Mariia Tkachenko
- Daria Kravets
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\nUA-CBT is a dataset inspired by Children's Book Test (URL containing machine-generated (and human-corrected) stories with gaps, and multiple possible options for words to fill the gaps.\n\nThe text is in Ukrainian language, and the options have been inflected to grammatically match the original ones. Gaps are of three types: named entities, common nouns, and verbs.",
"## Dataset Details",
"### Dataset Description\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?\n\n- Serhii Hamotskyi\n- TODO\n- Mariia Tkachenko\n- Daria Kravets",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#language-Ukrainian #license-cc-by-nc-4.0 #arxiv-1511.02301 #region-us \n",
"# Dataset Card for Dataset Name\n\nUA-CBT is a dataset inspired by Children's Book Test (URL containing machine-generated (and human-corrected) stories with gaps, and multiple possible options for words to fill the gaps.\n\nThe text is in Ukrainian language, and the options have been inflected to grammatically match the original ones. Gaps are of three types: named entities, common nouns, and verbs.",
"## Dataset Details",
"### Dataset Description\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?\n\n- Serhii Hamotskyi\n- TODO\n- Mariia Tkachenko\n- Daria Kravets",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
dc8f9855d16f3344cf7233bfed6d8d5e41b09968 | # Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: autoevaluate/summarization-not-evaluated
* Dataset: autoevaluate/xsum-sample
* Config: autoevaluate--xsum-sample
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
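As a rough sketch of how the referenced evaluation data can be inspected (the dataset, split, and column names below are the ones listed in this card's metadata; everything else is illustrative):

```python
from datasets import load_dataset

# Load the evaluation split referenced by this card.
eval_ds = load_dataset("autoevaluate/xsum-sample", split="test")

# The card's column mapping pairs the input text ("document") with the reference target ("summary").
example = eval_ds[0]
print(example["document"][:200])
print(example["summary"])
```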
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. | autoevaluate/autoeval-staging-eval-autoevaluate__xsum-sample-autoevaluate__xsum-sample-437a8a-17406356 | [
"autotrain",
"evaluation",
"region:us"
] | 2022-12-05T20:08:55+00:00 | {"type": "predictions", "tags": ["autotrain", "evaluation"], "datasets": ["autoevaluate/xsum-sample"], "eval_info": {"task": "summarization", "model": "autoevaluate/summarization-not-evaluated", "metrics": [], "dataset_name": "autoevaluate/xsum-sample", "dataset_config": "autoevaluate--xsum-sample", "dataset_split": "test", "col_mapping": {"text": "document", "target": "summary"}}} | 2022-12-05T20:09:19+00:00 | [] | [] | TAGS
#autotrain #evaluation #region-us
| # Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by AutoTrain for the following task and dataset:
* Task: Summarization
* Model: autoevaluate/summarization-not-evaluated
* Dataset: autoevaluate/xsum-sample
* Config: autoevaluate--xsum-sample
* Split: test
To run new evaluation jobs, visit Hugging Face's automatic model evaluator.
## Contributions
Thanks to @lewtun for evaluating this model. | [
"# Dataset Card for AutoTrain Evaluator\n\nThis repository contains model predictions generated by AutoTrain for the following task and dataset:\n\n* Task: Summarization\n* Model: autoevaluate/summarization-not-evaluated\n* Dataset: autoevaluate/xsum-sample\n* Config: autoevaluate--xsum-sample\n* Split: test\n\nTo run new evaluation jobs, visit Hugging Face's automatic model evaluator.",
"## Contributions\n\nThanks to @lewtun for evaluating this model."
] | [
"TAGS\n#autotrain #evaluation #region-us \n",
"# Dataset Card for AutoTrain Evaluator\n\nThis repository contains model predictions generated by AutoTrain for the following task and dataset:\n\n* Task: Summarization\n* Model: autoevaluate/summarization-not-evaluated\n* Dataset: autoevaluate/xsum-sample\n* Config: autoevaluate--xsum-sample\n* Split: test\n\nTo run new evaluation jobs, visit Hugging Face's automatic model evaluator.",
"## Contributions\n\nThanks to @lewtun for evaluating this model."
] |
0414473221310c4d7208d9972dac3165025946b6 |
# Dataset of tenkyuu_chimata/텐큐치마타 (Touhou)
This is the dataset of tenkyuu_chimata/텐큐치마타 (Touhou), containing 500 images and their tags.
The core tags of this character are `short_hair, multicolored_hairband, blue_hair, hairband, blue_eyes, bangs, purple_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 588.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tenkyuu_chimata_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 330.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tenkyuu_chimata_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1162 | 694.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tenkyuu_chimata_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 521.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tenkyuu_chimata_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1162 | 973.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tenkyuu_chimata_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
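If you only need one of the packaged variants from the table above (rather than the raw data handled in the next section), it can be fetched directly; the sketch below uses the `dataset-800.zip` file listed in the table, and the local directory name is just an illustrative choice.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Download one of the packaged variants listed in the table (here: the 800px IMG+TXT package).
zip_file = hf_hub_download(
    repo_id='CyberHarem/tenkyuu_chimata_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# Extract the images and their .txt tag files to a local directory (name is arbitrary).
out_dir = 'dataset_800'
os.makedirs(out_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(out_dir)
```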
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tenkyuu_chimata_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, boots, multicolored_dress, patchwork_clothes, rainbow_gradient, sky_print, solo, full_body, hair_between_eyes, long_sleeves, white_cloak, purple_footwear, blush, open_mouth, ahoge, cape, :d, pink_footwear |
| 1 | 24 |  |  |  |  |  | 1girl, long_sleeves, multicolored_dress, patchwork_clothes, rainbow_gradient, solo, white_cape, open_mouth, sky_print, looking_at_viewer, pointing_up, zipper, two-sided_cape, :d, index_finger_raised, blush |
| 2 | 7 |  |  |  |  |  | 1girl, closed_mouth, multicolored_dress, patchwork_clothes, rainbow_gradient, simple_background, solo, looking_at_viewer, upper_body, white_cape, long_sleeves, white_background, white_cloak |
| 3 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, rainbow_gradient, simple_background, solo, white_background, hair_between_eyes, portrait, purple_eyes, smile, blush, closed_mouth |
| 4 | 5 |  |  |  |  |  | 1girl, blue_dress, blue_hairband, blue_sky, brown_belt, buttons, cloud_print, cloudy_sky, collar, fire, green_dress, green_hairband, hair_between_eyes, hand_up, looking_to_the_side, multicolored_dress, orange_dress, orange_hairband, orange_sleeves, pink_dress, pink_hairband, puffy_long_sleeves, purple_dress, purple_eyes, purple_hairband, rainbow_gradient, red_hairband, red_sleeves, sky_print, smile, solo, white_cloak, yellow_bag, yellow_dress, yellow_hairband, arm_up, brown_bag, medium_breasts, multicolored_background, blush, boots, looking_at_viewer, open_mouth, pointing, standing, yellow_background, zipper, card, closed_mouth, flying, gradient_background, light, teeth |
| 5 | 5 |  |  |  |  |  | 1girl, blue_dress, blue_hairband, boots, cloudy_sky, full_body, green_dress, green_hairband, hair_between_eyes, multicolored_dress, orange_dress, orange_sleeves, pink_footwear, pink_hairband, purple_dress, purple_eyes, purple_hairband, red_sleeves, sky_print, smile, solo, standing, white_cloak, yellow_dress, yellow_hairband, yellow_sleeves, blue_sky, brown_belt, buttons, cloud_print, looking_to_the_side, orange_hairband, pink_dress, puffy_long_sleeves, rainbow_gradient, simple_background, white_bow, yellow_bag, alphes_(style), arm_up, closed_mouth, fire, footwear_bow, hand_on_hip, hand_up, red_hairband, tachi-e, transparent_background, blush, open_mouth, pointing, smug, white_background, white_collar |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | boots | multicolored_dress | patchwork_clothes | rainbow_gradient | sky_print | solo | full_body | hair_between_eyes | long_sleeves | white_cloak | purple_footwear | blush | open_mouth | ahoge | cape | :d | pink_footwear | white_cape | looking_at_viewer | pointing_up | zipper | two-sided_cape | index_finger_raised | closed_mouth | simple_background | upper_body | white_background | portrait | purple_eyes | smile | blue_dress | blue_hairband | blue_sky | brown_belt | buttons | cloud_print | cloudy_sky | collar | fire | green_dress | green_hairband | hand_up | looking_to_the_side | orange_dress | orange_hairband | orange_sleeves | pink_dress | pink_hairband | puffy_long_sleeves | purple_dress | purple_hairband | red_hairband | red_sleeves | yellow_bag | yellow_dress | yellow_hairband | arm_up | brown_bag | medium_breasts | multicolored_background | pointing | standing | yellow_background | card | flying | gradient_background | light | teeth | yellow_sleeves | white_bow | alphes_(style) | footwear_bow | hand_on_hip | tachi-e | transparent_background | smug | white_collar |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:---------------------|:--------------------|:-------------------|:------------|:-------|:------------|:--------------------|:---------------|:--------------|:------------------|:--------|:-------------|:--------|:-------|:-----|:----------------|:-------------|:--------------------|:--------------|:---------|:-----------------|:----------------------|:---------------|:--------------------|:-------------|:-------------------|:-----------|:--------------|:--------|:-------------|:----------------|:-----------|:-------------|:----------|:--------------|:-------------|:---------|:-------|:--------------|:-----------------|:----------|:----------------------|:---------------|:------------------|:-----------------|:-------------|:----------------|:---------------------|:---------------|:------------------|:---------------|:--------------|:-------------|:---------------|:------------------|:---------|:------------|:-----------------|:--------------------------|:-----------|:-----------|:--------------------|:-------|:---------|:----------------------|:--------|:--------|:-----------------|:------------|:-----------------|:---------------|:--------------|:----------|:-------------------------|:-------|:---------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 24 |  |  |  |  |  | X | | X | X | X | X | X | | | X | | | X | X | | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | X | X | X | | X | | | X | X | | | | | | | | X | X | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | | | X | | X | | X | | | | X | | | | | | | X | | | | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | | X | X | X | | X | | X | | X | X | | | | | | X | | X | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | | X | X | X | X | X | | X | | X | X | | | | X | | | | | | | X | X | | X | | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | X | X | | | | | | | X | X | X | X | X | X | X | X | X |
| CyberHarem/tenkyuu_chimata_touhou | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-15T10:22:16+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-15T12:06:09+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of tenkyuu\_chimata/텐큐치마타 (Touhou)
==========================================
This is the dataset of tenkyuu\_chimata/텐큐치마타 (Touhou), containing 500 images and their tags.
The core tags of this character are 'short\_hair, multicolored\_hairband, blue\_hair, hairband, blue\_eyes, bangs, purple\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
9c58d0d214dbf367f385f0edeecccde13eeb359a | # Dataset Card for "c_x86_avx2_extension_filtered"
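The card body has not been filled in yet. As a minimal, hypothetical usage sketch (assuming the single `text` column and the `train` split declared in this card's metadata), the data can be loaded with `datasets`:
```python
from datasets import load_dataset

# Sketch only; assumes the default config with a "train" split and a single
# string column named "text", as listed in the dataset_info metadata.
ds = load_dataset("zhangshuoming/c_x86_avx2_extension_filtered", split="train")

print(ds)                   # expected: ~1101 rows with one "text" feature
print(ds[0]["text"][:300])  # peek at the first example
```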
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | zhangshuoming/c_x86_avx2_extension_filtered | [
"region:us"
] | 2024-01-15T10:34:51+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 856916.0, "num_examples": 1101}], "download_size": 129124, "dataset_size": 856916.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-16T03:33:53+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "c_x86_avx2_extension_filtered"
More Information needed | [
"# Dataset Card for \"c_x86_avx2_extension_filtered\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"c_x86_avx2_extension_filtered\"\n\nMore Information needed"
] |
e8b6686408c6d9c8e6e8c8184e064ffc81829821 |
# Dataset Card for Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-sparsity-20
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [wang7776/Mistral-7B-Instruct-v0.2-sparsity-20](https://huggingface.co/wang7776/Mistral-7B-Instruct-v0.2-sparsity-20) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-20",
"harness_winogrande_5",
split="train")
```
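The aggregated metrics mentioned above live in the separate "results" configuration. A minimal sketch for loading them, assuming it follows the same "latest" split convention as the per-task configurations, could be:
```python
from datasets import load_dataset

# Sketch only: load the aggregated "results" configuration described above,
# assuming it exposes the same "latest" split alias as the per-task configs.
results = load_dataset(
    "open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-20",
    "results",
    split="latest",
)
print(results)
```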
## Latest results
These are the [latest results from run 2024-01-15T10:42:05.147679](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-20/blob/main/results_2024-01-15T10-42-05.147679.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.47207806761545057,
"acc_stderr": 0.0343560515515407,
"acc_norm": 0.4785715950847517,
"acc_norm_stderr": 0.03514049991292305,
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144905,
"mc2": 0.47217481958020674,
"mc2_stderr": 0.01506601596455064
},
"harness|arc:challenge|25": {
"acc": 0.4761092150170648,
"acc_stderr": 0.014594701798071654,
"acc_norm": 0.5264505119453925,
"acc_norm_stderr": 0.014590931358120174
},
"harness|hellaswag|10": {
"acc": 0.5748854809798845,
"acc_stderr": 0.004933500261683596,
"acc_norm": 0.767078271260705,
"acc_norm_stderr": 0.004218289279767987
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4934210526315789,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.4934210526315789,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5584905660377358,
"acc_stderr": 0.030561590426731837,
"acc_norm": 0.5584905660377358,
"acc_norm_stderr": 0.030561590426731837
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666666,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666666
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.037657466938651504,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.037657466938651504
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.042207736591714506,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.042207736591714506
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3659574468085106,
"acc_stderr": 0.031489558297455304,
"acc_norm": 0.3659574468085106,
"acc_norm_stderr": 0.031489558297455304
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.19298245614035087,
"acc_stderr": 0.037124548537213684,
"acc_norm": 0.19298245614035087,
"acc_norm_stderr": 0.037124548537213684
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31216931216931215,
"acc_stderr": 0.023865206836972602,
"acc_norm": 0.31216931216931215,
"acc_norm_stderr": 0.023865206836972602
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147125,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147125
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5193548387096775,
"acc_stderr": 0.028422687404312107,
"acc_norm": 0.5193548387096775,
"acc_norm_stderr": 0.028422687404312107
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.03395970381998574,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.03395970381998574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5515151515151515,
"acc_stderr": 0.038835659779569286,
"acc_norm": 0.5515151515151515,
"acc_norm_stderr": 0.038835659779569286
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6212121212121212,
"acc_stderr": 0.03456088731993747,
"acc_norm": 0.6212121212121212,
"acc_norm_stderr": 0.03456088731993747
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.694300518134715,
"acc_stderr": 0.033248379397581594,
"acc_norm": 0.694300518134715,
"acc_norm_stderr": 0.033248379397581594
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.025294608023986472,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.025294608023986472
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.025644108639267638,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.025644108639267638
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.031968769891957786,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.031968769891957786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6256880733944954,
"acc_stderr": 0.020748959408988313,
"acc_norm": 0.6256880733944954,
"acc_norm_stderr": 0.020748959408988313
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.03441190023482465,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.03441190023482465
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6244725738396625,
"acc_stderr": 0.03152256243091156,
"acc_norm": 0.6244725738396625,
"acc_norm_stderr": 0.03152256243091156
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.57847533632287,
"acc_stderr": 0.03314190222110657,
"acc_norm": 0.57847533632287,
"acc_norm_stderr": 0.03314190222110657
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04668408033024931,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04668408033024931
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5398773006134969,
"acc_stderr": 0.03915857291436971,
"acc_norm": 0.5398773006134969,
"acc_norm_stderr": 0.03915857291436971
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.6116504854368932,
"acc_stderr": 0.0482572933735639,
"acc_norm": 0.6116504854368932,
"acc_norm_stderr": 0.0482572933735639
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6837606837606838,
"acc_stderr": 0.03046365674734025,
"acc_norm": 0.6837606837606838,
"acc_norm_stderr": 0.03046365674734025
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.644955300127714,
"acc_stderr": 0.017112085772772994,
"acc_norm": 0.644955300127714,
"acc_norm_stderr": 0.017112085772772994
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5,
"acc_stderr": 0.026919095102908273,
"acc_norm": 0.5,
"acc_norm_stderr": 0.026919095102908273
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.02849199358617156,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.02849199358617156
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5176848874598071,
"acc_stderr": 0.02838032284907713,
"acc_norm": 0.5176848874598071,
"acc_norm_stderr": 0.02838032284907713
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.0275860062216077,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.0275860062216077
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3120567375886525,
"acc_stderr": 0.027640120545169924,
"acc_norm": 0.3120567375886525,
"acc_norm_stderr": 0.027640120545169924
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3494132985658409,
"acc_stderr": 0.012177306252786691,
"acc_norm": 0.3494132985658409,
"acc_norm_stderr": 0.012177306252786691
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.0302114796091216,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.0302114796091216
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4297385620915033,
"acc_stderr": 0.020027122784928547,
"acc_norm": 0.4297385620915033,
"acc_norm_stderr": 0.020027122784928547
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4090909090909091,
"acc_stderr": 0.04709306978661897,
"acc_norm": 0.4090909090909091,
"acc_norm_stderr": 0.04709306978661897
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5387755102040817,
"acc_stderr": 0.031912820526692774,
"acc_norm": 0.5387755102040817,
"acc_norm_stderr": 0.031912820526692774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6417910447761194,
"acc_stderr": 0.03390393042268814,
"acc_norm": 0.6417910447761194,
"acc_norm_stderr": 0.03390393042268814
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-virology|5": {
"acc": 0.35542168674698793,
"acc_stderr": 0.03726214354322416,
"acc_norm": 0.35542168674698793,
"acc_norm_stderr": 0.03726214354322416
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.03733756969066164,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.03733756969066164
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30966952264381886,
"mc1_stderr": 0.016185744355144905,
"mc2": 0.47217481958020674,
"mc2_stderr": 0.01506601596455064
},
"harness|winogrande|5": {
"acc": 0.6906077348066298,
"acc_stderr": 0.012991329330822993
},
"harness|gsm8k|5": {
"acc": 0.11296436694465505,
"acc_stderr": 0.008719339028833054
}
}
```
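The block above is a nested mapping from task name to metric values. As an illustrative sketch (not part of the evaluation tooling), it can be flattened into per-task rows for easier comparison; the dictionary literal below simply restates a couple of the entries shown above, and the on-disk `results_*.json` file may wrap this mapping in additional run metadata.
```python
# Sketch only: flatten {task: {metric: value, ...}} into per-task rows.
results = {
    "all": {"acc": 0.47207806761545057, "acc_norm": 0.4785715950847517},
    "harness|arc:challenge|25": {"acc": 0.4761092150170648, "acc_norm": 0.5264505119453925},
    # ... remaining tasks exactly as listed above ...
}

for task, metrics in sorted(results.items()):
    if task == "all":
        continue  # keep the overall average separate from the per-task rows
    print(f"{task:55s} acc={metrics.get('acc')}  acc_norm={metrics.get('acc_norm')}")
```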
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-20 | [
"region:us"
] | 2024-01-15T10:43:54+00:00 | {"pretty_name": "Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-sparsity-20", "dataset_summary": "Dataset automatically created during the evaluation run of model [wang7776/Mistral-7B-Instruct-v0.2-sparsity-20](https://huggingface.co/wang7776/Mistral-7B-Instruct-v0.2-sparsity-20) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-20\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-15T10:42:05.147679](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-20/blob/main/results_2024-01-15T10-42-05.147679.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.47207806761545057,\n \"acc_stderr\": 0.0343560515515407,\n \"acc_norm\": 0.4785715950847517,\n \"acc_norm_stderr\": 0.03514049991292305,\n \"mc1\": 0.30966952264381886,\n \"mc1_stderr\": 0.016185744355144905,\n \"mc2\": 0.47217481958020674,\n \"mc2_stderr\": 0.01506601596455064\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4761092150170648,\n \"acc_stderr\": 0.014594701798071654,\n \"acc_norm\": 0.5264505119453925,\n \"acc_norm_stderr\": 0.014590931358120174\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5748854809798845,\n \"acc_stderr\": 0.004933500261683596,\n \"acc_norm\": 0.767078271260705,\n \"acc_norm_stderr\": 0.004218289279767987\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.4740740740740741,\n \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.040685900502249704,\n \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.040685900502249704\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.030561590426731837,\n \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.030561590426731837\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.04166666666666666,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.04166666666666666\n 
},\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n \"acc_stderr\": 0.037657466938651504,\n \"acc_norm\": 0.42196531791907516,\n \"acc_norm_stderr\": 0.037657466938651504\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.042207736591714506,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.042207736591714506\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3659574468085106,\n \"acc_stderr\": 0.031489558297455304,\n \"acc_norm\": 0.3659574468085106,\n \"acc_norm_stderr\": 0.031489558297455304\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.19298245614035087,\n \"acc_stderr\": 0.037124548537213684,\n \"acc_norm\": 0.19298245614035087,\n \"acc_norm_stderr\": 0.037124548537213684\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.31216931216931215,\n \"acc_stderr\": 0.023865206836972602,\n \"acc_norm\": 0.31216931216931215,\n \"acc_norm_stderr\": 0.023865206836972602\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.04073524322147125,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.04073524322147125\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5193548387096775,\n \"acc_stderr\": 0.028422687404312107,\n \"acc_norm\": 0.5193548387096775,\n \"acc_norm_stderr\": 0.028422687404312107\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.03395970381998574,\n \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.03395970381998574\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5515151515151515,\n \"acc_stderr\": 0.038835659779569286,\n \"acc_norm\": 0.5515151515151515,\n \"acc_norm_stderr\": 0.038835659779569286\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6212121212121212,\n \"acc_stderr\": 0.03456088731993747,\n \"acc_norm\": 0.6212121212121212,\n \"acc_norm_stderr\": 0.03456088731993747\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.694300518134715,\n \"acc_stderr\": 0.033248379397581594,\n \"acc_norm\": 
0.694300518134715,\n \"acc_norm_stderr\": 0.033248379397581594\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.025294608023986472,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.025294608023986472\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.025644108639267638,\n \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.025644108639267638\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.031968769891957786,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.031968769891957786\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6256880733944954,\n \"acc_stderr\": 0.020748959408988313,\n \"acc_norm\": 0.6256880733944954,\n \"acc_norm_stderr\": 0.020748959408988313\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.03441190023482465,\n \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.03441190023482465\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6244725738396625,\n \"acc_stderr\": 0.03152256243091156,\n \"acc_norm\": 0.6244725738396625,\n \"acc_norm_stderr\": 0.03152256243091156\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.57847533632287,\n \"acc_stderr\": 0.03314190222110657,\n \"acc_norm\": 0.57847533632287,\n \"acc_norm_stderr\": 0.03314190222110657\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\": 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.04668408033024931,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.04668408033024931\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5398773006134969,\n \"acc_stderr\": 0.03915857291436971,\n \"acc_norm\": 0.5398773006134969,\n \"acc_norm_stderr\": 0.03915857291436971\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.0482572933735639,\n \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.0482572933735639\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6837606837606838,\n \"acc_stderr\": 0.03046365674734025,\n \"acc_norm\": 0.6837606837606838,\n \"acc_norm_stderr\": 0.03046365674734025\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.644955300127714,\n \"acc_stderr\": 0.017112085772772994,\n \"acc_norm\": 0.644955300127714,\n \"acc_norm_stderr\": 0.017112085772772994\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.026919095102908273,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.026919095102908273\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.02849199358617156,\n \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.02849199358617156\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5176848874598071,\n \"acc_stderr\": 0.02838032284907713,\n \"acc_norm\": 0.5176848874598071,\n \"acc_norm_stderr\": 0.02838032284907713\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.0275860062216077,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.0275860062216077\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3120567375886525,\n \"acc_stderr\": 0.027640120545169924,\n \"acc_norm\": 0.3120567375886525,\n \"acc_norm_stderr\": 0.027640120545169924\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3494132985658409,\n \"acc_stderr\": 0.012177306252786691,\n \"acc_norm\": 0.3494132985658409,\n \"acc_norm_stderr\": 0.012177306252786691\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.0302114796091216,\n \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.0302114796091216\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4297385620915033,\n \"acc_stderr\": 0.020027122784928547,\n \"acc_norm\": 0.4297385620915033,\n \"acc_norm_stderr\": 0.020027122784928547\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4090909090909091,\n \"acc_stderr\": 0.04709306978661897,\n \"acc_norm\": 0.4090909090909091,\n \"acc_norm_stderr\": 0.04709306978661897\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5387755102040817,\n \"acc_stderr\": 0.031912820526692774,\n \"acc_norm\": 0.5387755102040817,\n \"acc_norm_stderr\": 0.031912820526692774\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6417910447761194,\n \"acc_stderr\": 0.03390393042268814,\n \"acc_norm\": 0.6417910447761194,\n \"acc_norm_stderr\": 0.03390393042268814\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.35542168674698793,\n \"acc_stderr\": 0.03726214354322416,\n \"acc_norm\": 0.35542168674698793,\n \"acc_norm_stderr\": 0.03726214354322416\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6140350877192983,\n \"acc_stderr\": 0.03733756969066164,\n \"acc_norm\": 0.6140350877192983,\n \"acc_norm_stderr\": 0.03733756969066164\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30966952264381886,\n \"mc1_stderr\": 0.016185744355144905,\n \"mc2\": 0.47217481958020674,\n \"mc2_stderr\": 0.01506601596455064\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6906077348066298,\n \"acc_stderr\": 0.012991329330822993\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11296436694465505,\n \"acc_stderr\": 
0.008719339028833054\n }\n}\n```", "repo_url": "https://huggingface.co/wang7776/Mistral-7B-Instruct-v0.2-sparsity-20", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|arc:challenge|25_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|gsm8k|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hellaswag|10_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T10-42-05.147679.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T10-42-05.147679.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-15T10-42-05.147679.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-15T10-42-05.147679.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T10-42-05.147679.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_15T10_42_05.147679", "path": ["**/details_harness|winogrande|5_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-15T10-42-05.147679.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_15T10_42_05.147679", "path": ["results_2024-01-15T10-42-05.147679.parquet"]}, {"split": "latest", "path": ["results_2024-01-15T10-42-05.147679.parquet"]}]}]} | 2024-01-15T10:44:15+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-sparsity-20
Dataset automatically created during the evaluation run of model wang7776/Mistral-7B-Instruct-v0.2-sparsity-20 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
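The snippet below is a minimal sketch; the repository id is an assumption based on the leaderboard's usual `details_<org>__<model>` naming convention for this model:

```python
from datasets import load_dataset

# Assumed repository id, following the leaderboard's details_<org>__<model> convention
data = load_dataset(
    "open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-20",
    "harness_winogrande_5",
    split="train",
)
print(data)
```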
## Latest results
These are the latest results from run 2024-01-15T10:42:05.147679 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-sparsity-20\n\n\n\nDataset automatically created during the evaluation run of model wang7776/Mistral-7B-Instruct-v0.2-sparsity-20 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-15T10:42:05.147679(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-sparsity-20\n\n\n\nDataset automatically created during the evaluation run of model wang7776/Mistral-7B-Instruct-v0.2-sparsity-20 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-15T10:42:05.147679(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
726ba663341a8ba9c96ee5d64cd217da38e3ba98 |
# Dataset Card for Evaluation run of uukuguy/speechless-nl2sql-ds-6.7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [uukuguy/speechless-nl2sql-ds-6.7b](https://huggingface.co/uukuguy/speechless-nl2sql-ds-6.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-nl2sql-ds-6.7b",
"harness_winogrande_5",
split="train")
```
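Beyond a single task configuration, the aggregated metrics can be pulled the same way. A small sketch, assuming the `results` configuration and `latest` split listed in this card's configuration table:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_uukuguy__speechless-nl2sql-ds-6.7b"

# Enumerate the per-task configurations available in this details repository
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# The "latest" split of the "results" configuration always points to the most recent run
results = load_dataset(repo, "results", split="latest")
print(results)
```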
## Latest results
These are the [latest results from run 2024-01-15T10:49:31.946178](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-nl2sql-ds-6.7b/blob/main/results_2024-01-15T10-49-31.946178.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.36753106147172254,
"acc_stderr": 0.034148744304961676,
"acc_norm": 0.37064831951086746,
"acc_norm_stderr": 0.034909927450973086,
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.40545579003887305,
"mc2_stderr": 0.014846664096046035
},
"harness|arc:challenge|25": {
"acc": 0.32764505119453924,
"acc_stderr": 0.013715847940719342,
"acc_norm": 0.363481228668942,
"acc_norm_stderr": 0.014056207319068282
},
"harness|hellaswag|10": {
"acc": 0.4047998406691894,
"acc_stderr": 0.004898501014225839,
"acc_norm": 0.528281218880701,
"acc_norm_stderr": 0.00498179308984826
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.04094376269996794,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.04094376269996794
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.37735849056603776,
"acc_stderr": 0.029832808114796005,
"acc_norm": 0.37735849056603776,
"acc_norm_stderr": 0.029832808114796005
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3402777777777778,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.3402777777777778,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3468208092485549,
"acc_stderr": 0.03629146670159663,
"acc_norm": 0.3468208092485549,
"acc_norm_stderr": 0.03629146670159663
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3446808510638298,
"acc_stderr": 0.03106898596312215,
"acc_norm": 0.3446808510638298,
"acc_norm_stderr": 0.03106898596312215
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.02441923496681907,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.02441923496681907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.38064516129032255,
"acc_stderr": 0.02762171783290704,
"acc_norm": 0.38064516129032255,
"acc_norm_stderr": 0.02762171783290704
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3939393939393939,
"acc_stderr": 0.0381549430868893,
"acc_norm": 0.3939393939393939,
"acc_norm_stderr": 0.0381549430868893
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3939393939393939,
"acc_stderr": 0.03481285338232963,
"acc_norm": 0.3939393939393939,
"acc_norm_stderr": 0.03481285338232963
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.40414507772020725,
"acc_stderr": 0.0354150857888402,
"acc_norm": 0.40414507772020725,
"acc_norm_stderr": 0.0354150857888402
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.35128205128205126,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.35128205128205126,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844058,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844058
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3739495798319328,
"acc_stderr": 0.031429466378837076,
"acc_norm": 0.3739495798319328,
"acc_norm_stderr": 0.031429466378837076
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3724770642201835,
"acc_stderr": 0.020728368457638494,
"acc_norm": 0.3724770642201835,
"acc_norm_stderr": 0.020728368457638494
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.03228210387037892,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.03228210387037892
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3037974683544304,
"acc_stderr": 0.02993669638713862,
"acc_norm": 0.3037974683544304,
"acc_norm_stderr": 0.02993669638713862
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.30493273542600896,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.30493273542600896,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3816793893129771,
"acc_stderr": 0.04260735157644559,
"acc_norm": 0.3816793893129771,
"acc_norm_stderr": 0.04260735157644559
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5289256198347108,
"acc_stderr": 0.04556710331269498,
"acc_norm": 0.5289256198347108,
"acc_norm_stderr": 0.04556710331269498
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.047500773411999854,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.047500773411999854
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4110429447852761,
"acc_stderr": 0.038656978537853624,
"acc_norm": 0.4110429447852761,
"acc_norm_stderr": 0.038656978537853624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.46601941747572817,
"acc_stderr": 0.0493929144727348,
"acc_norm": 0.46601941747572817,
"acc_norm_stderr": 0.0493929144727348
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5726495726495726,
"acc_stderr": 0.03240847393516327,
"acc_norm": 0.5726495726495726,
"acc_norm_stderr": 0.03240847393516327
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.388250319284802,
"acc_stderr": 0.017427673295544326,
"acc_norm": 0.388250319284802,
"acc_norm_stderr": 0.017427673295544326
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3670520231213873,
"acc_stderr": 0.025950054337654085,
"acc_norm": 0.3670520231213873,
"acc_norm_stderr": 0.025950054337654085
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25139664804469275,
"acc_stderr": 0.01450897945355399,
"acc_norm": 0.25139664804469275,
"acc_norm_stderr": 0.01450897945355399
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3954248366013072,
"acc_stderr": 0.027996723180631438,
"acc_norm": 0.3954248366013072,
"acc_norm_stderr": 0.027996723180631438
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3890675241157556,
"acc_stderr": 0.027690337536485376,
"acc_norm": 0.3890675241157556,
"acc_norm_stderr": 0.027690337536485376
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02540719779889017,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02540719779889017
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2801418439716312,
"acc_stderr": 0.02678917235114024,
"acc_norm": 0.2801418439716312,
"acc_norm_stderr": 0.02678917235114024
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.28748370273794005,
"acc_stderr": 0.011559337355708505,
"acc_norm": 0.28748370273794005,
"acc_norm_stderr": 0.011559337355708505
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.43014705882352944,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.43014705882352944,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.017952449196987862,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.017952449196987862
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794916,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794916
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.39591836734693875,
"acc_stderr": 0.03130802899065686,
"acc_norm": 0.39591836734693875,
"acc_norm_stderr": 0.03130802899065686
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4129353233830846,
"acc_stderr": 0.03481520803367348,
"acc_norm": 0.4129353233830846,
"acc_norm_stderr": 0.03481520803367348
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-virology|5": {
"acc": 0.37349397590361444,
"acc_stderr": 0.03765845117168864,
"acc_norm": 0.37349397590361444,
"acc_norm_stderr": 0.03765845117168864
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4093567251461988,
"acc_stderr": 0.037712831076265434,
"acc_norm": 0.4093567251461988,
"acc_norm_stderr": 0.037712831076265434
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.40545579003887305,
"mc2_stderr": 0.014846664096046035
},
"harness|winogrande|5": {
"acc": 0.5595895816890292,
"acc_stderr": 0.0139523303119156
},
"harness|gsm8k|5": {
"acc": 0.1508718726307809,
"acc_stderr": 0.009859004137305687
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_uukuguy__speechless-nl2sql-ds-6.7b | [
"region:us"
] | 2024-01-15T10:51:50+00:00 | {"pretty_name": "Evaluation run of uukuguy/speechless-nl2sql-ds-6.7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [uukuguy/speechless-nl2sql-ds-6.7b](https://huggingface.co/uukuguy/speechless-nl2sql-ds-6.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-nl2sql-ds-6.7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-15T10:49:31.946178](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-nl2sql-ds-6.7b/blob/main/results_2024-01-15T10-49-31.946178.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.36753106147172254,\n \"acc_stderr\": 0.034148744304961676,\n \"acc_norm\": 0.37064831951086746,\n \"acc_norm_stderr\": 0.034909927450973086,\n \"mc1\": 0.2386780905752754,\n \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.40545579003887305,\n \"mc2_stderr\": 0.014846664096046035\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.32764505119453924,\n \"acc_stderr\": 0.013715847940719342,\n \"acc_norm\": 0.363481228668942,\n \"acc_norm_stderr\": 0.014056207319068282\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4047998406691894,\n \"acc_stderr\": 0.004898501014225839,\n \"acc_norm\": 0.528281218880701,\n \"acc_norm_stderr\": 0.00498179308984826\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.04094376269996794,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.04094376269996794\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.03925523381052932,\n \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.03925523381052932\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.37735849056603776,\n \"acc_stderr\": 0.029832808114796005,\n \"acc_norm\": 0.37735849056603776,\n \"acc_norm_stderr\": 0.029832808114796005\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3402777777777778,\n \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.3402777777777778,\n \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3468208092485549,\n \"acc_stderr\": 0.03629146670159663,\n \"acc_norm\": 0.3468208092485549,\n \"acc_norm_stderr\": 0.03629146670159663\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3446808510638298,\n \"acc_stderr\": 0.03106898596312215,\n \"acc_norm\": 0.3446808510638298,\n \"acc_norm_stderr\": 0.03106898596312215\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3412698412698413,\n \"acc_stderr\": 0.02441923496681907,\n \"acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.02441923496681907\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.38064516129032255,\n \"acc_stderr\": 0.02762171783290704,\n \"acc_norm\": 0.38064516129032255,\n \"acc_norm_stderr\": 0.02762171783290704\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.3939393939393939,\n \"acc_stderr\": 0.0381549430868893,\n \"acc_norm\": 0.3939393939393939,\n \"acc_norm_stderr\": 0.0381549430868893\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3939393939393939,\n \"acc_stderr\": 0.03481285338232963,\n \"acc_norm\": 0.3939393939393939,\n \"acc_norm_stderr\": 0.03481285338232963\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.40414507772020725,\n \"acc_stderr\": 0.0354150857888402,\n \"acc_norm\": 0.40414507772020725,\n \"acc_norm_stderr\": 0.0354150857888402\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.35128205128205126,\n \"acc_stderr\": 0.024203665177902803,\n \"acc_norm\": 0.35128205128205126,\n \"acc_norm_stderr\": 0.024203665177902803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844058,\n \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844058\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3739495798319328,\n \"acc_stderr\": 0.031429466378837076,\n \"acc_norm\": 0.3739495798319328,\n \"acc_norm_stderr\": 0.031429466378837076\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3724770642201835,\n \"acc_stderr\": 0.020728368457638494,\n \"acc_norm\": 0.3724770642201835,\n \"acc_norm_stderr\": 0.020728368457638494\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3472222222222222,\n \"acc_stderr\": 0.032468872436376486,\n \"acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.032468872436376486\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.03228210387037892,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.03228210387037892\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.3037974683544304,\n \"acc_stderr\": 0.02993669638713862,\n \"acc_norm\": 0.3037974683544304,\n \"acc_norm_stderr\": 0.02993669638713862\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.30493273542600896,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.30493273542600896,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.3816793893129771,\n \"acc_stderr\": 0.04260735157644559,\n \"acc_norm\": 0.3816793893129771,\n \"acc_norm_stderr\": 0.04260735157644559\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5289256198347108,\n \"acc_stderr\": 0.04556710331269498,\n \"acc_norm\": 0.5289256198347108,\n \"acc_norm_stderr\": 0.04556710331269498\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.047500773411999854,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.047500773411999854\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4110429447852761,\n \"acc_stderr\": 0.038656978537853624,\n \"acc_norm\": 0.4110429447852761,\n \"acc_norm_stderr\": 0.038656978537853624\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.46601941747572817,\n \"acc_stderr\": 0.0493929144727348,\n \"acc_norm\": 0.46601941747572817,\n \"acc_norm_stderr\": 0.0493929144727348\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5726495726495726,\n \"acc_stderr\": 0.03240847393516327,\n \"acc_norm\": 0.5726495726495726,\n \"acc_norm_stderr\": 0.03240847393516327\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.388250319284802,\n \"acc_stderr\": 0.017427673295544326,\n \"acc_norm\": 0.388250319284802,\n \"acc_norm_stderr\": 0.017427673295544326\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.3670520231213873,\n \"acc_stderr\": 0.025950054337654085,\n \"acc_norm\": 0.3670520231213873,\n \"acc_norm_stderr\": 0.025950054337654085\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25139664804469275,\n \"acc_stderr\": 0.01450897945355399,\n \"acc_norm\": 0.25139664804469275,\n \"acc_norm_stderr\": 0.01450897945355399\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3954248366013072,\n \"acc_stderr\": 0.027996723180631438,\n \"acc_norm\": 0.3954248366013072,\n \"acc_norm_stderr\": 0.027996723180631438\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3890675241157556,\n \"acc_stderr\": 0.027690337536485376,\n \"acc_norm\": 0.3890675241157556,\n \"acc_norm_stderr\": 0.027690337536485376\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02540719779889017,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02540719779889017\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2801418439716312,\n \"acc_stderr\": 0.02678917235114024,\n \"acc_norm\": 0.2801418439716312,\n \"acc_norm_stderr\": 0.02678917235114024\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.28748370273794005,\n \"acc_stderr\": 0.011559337355708505,\n \"acc_norm\": 0.28748370273794005,\n \"acc_norm_stderr\": 0.011559337355708505\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.43014705882352944,\n \"acc_stderr\": 0.030074971917302875,\n \"acc_norm\": 0.43014705882352944,\n \"acc_norm_stderr\": 0.030074971917302875\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2696078431372549,\n \"acc_stderr\": 0.017952449196987862,\n \"acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.017952449196987862\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.5181818181818182,\n \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.39591836734693875,\n \"acc_stderr\": 0.03130802899065686,\n \"acc_norm\": 0.39591836734693875,\n \"acc_norm_stderr\": 0.03130802899065686\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4129353233830846,\n \"acc_stderr\": 0.03481520803367348,\n \"acc_norm\": 0.4129353233830846,\n \"acc_norm_stderr\": 0.03481520803367348\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.37349397590361444,\n \"acc_stderr\": 0.03765845117168864,\n \"acc_norm\": 0.37349397590361444,\n \"acc_norm_stderr\": 0.03765845117168864\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.4093567251461988,\n \"acc_stderr\": 0.037712831076265434,\n \"acc_norm\": 0.4093567251461988,\n \"acc_norm_stderr\": 0.037712831076265434\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2386780905752754,\n \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.40545579003887305,\n \"mc2_stderr\": 0.014846664096046035\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5595895816890292,\n \"acc_stderr\": 0.0139523303119156\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1508718726307809,\n \"acc_stderr\": 
0.009859004137305687\n }\n}\n```", "repo_url": "https://huggingface.co/uukuguy/speechless-nl2sql-ds-6.7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|arc:challenge|25_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|gsm8k|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hellaswag|10_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T10-49-31.946178.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T10-49-31.946178.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-15T10-49-31.946178.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-15T10-49-31.946178.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T10-49-31.946178.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_15T10_49_31.946178", "path": ["**/details_harness|winogrande|5_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-15T10-49-31.946178.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_15T10_49_31.946178", "path": ["results_2024-01-15T10-49-31.946178.parquet"]}, {"split": "latest", "path": ["results_2024-01-15T10-49-31.946178.parquet"]}]}]} | 2024-01-15T10:52:11+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of uukuguy/speechless-nl2sql-ds-6.7b
Dataset automatically created during the evaluation run of model uukuguy/speechless-nl2sql-ds-6.7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
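A minimal sketch (the repository id below is inferred from the `details_<org>__<model>` naming pattern these evaluation datasets follow, so treat it as an assumption rather than a confirmed path):

```python
from datasets import load_dataset

# Repository id inferred from the card title; adjust it if the actual path differs.
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-nl2sql-ds-6.7b",
                    "harness_winogrande_5",
                    split="train")
print(data)
```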
## Latest results
These are the latest results from run 2024-01-15T10:49:31.946178 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of uukuguy/speechless-nl2sql-ds-6.7b\n\n\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-nl2sql-ds-6.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-15T10:49:31.946178(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of uukuguy/speechless-nl2sql-ds-6.7b\n\n\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-nl2sql-ds-6.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-15T10:49:31.946178(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
1ab66d0b374857b051097b301e1374c99ac816df |
# Dataset Card for Evaluation run of cloudyu/Qwen-72Bx2-MoE-120B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cloudyu/Qwen-72Bx2-MoE-120B](https://huggingface.co/cloudyu/Qwen-72Bx2-MoE-120B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cloudyu__Qwen-72Bx2-MoE-120B",
"harness_winogrande_5",
split="train")
```
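The same call works for any of the per-task configurations listed in this dataset's metadata; for example, a minimal sketch for the 5-shot GSM8K details (assuming the `harness_gsm8k_5` configuration and the `latest` split described above):

```python
from datasets import load_dataset

# Per-sample details for the 5-shot GSM8K evaluation; "latest" points to the newest run.
gsm8k_details = load_dataset("open-llm-leaderboard/details_cloudyu__Qwen-72Bx2-MoE-120B",
                             "harness_gsm8k_5",
                             split="latest")
print(gsm8k_details[0])  # inspect one record to see the available fields
```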
## Latest results
These are the [latest results from run 2024-01-15T10:51:00.615971](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Qwen-72Bx2-MoE-120B/blob/main/results_2024-01-15T10-51-00.615971.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23280937126754725,
"acc_stderr": 0.030031934560283337,
"acc_norm": 0.2333859951011877,
"acc_norm_stderr": 0.030826897864000263,
"mc1": 0.22766217870257038,
"mc1_stderr": 0.01467925503211107,
"mc2": 0.4891376724889372,
"mc2_stderr": 0.016320771330589307
},
"harness|arc:challenge|25": {
"acc": 0.2090443686006826,
"acc_stderr": 0.011882746987406453,
"acc_norm": 0.2593856655290102,
"acc_norm_stderr": 0.012808273573927099
},
"harness|hellaswag|10": {
"acc": 0.2590121489743079,
"acc_stderr": 0.004371969542814558,
"acc_norm": 0.24905397331208923,
"acc_norm_stderr": 0.004315812968431582
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.1925925925925926,
"acc_stderr": 0.03406542058502653,
"acc_norm": 0.1925925925925926,
"acc_norm_stderr": 0.03406542058502653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.025447863825108632,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.025447863825108632
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03718489006818115,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03718489006818115
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2358974358974359,
"acc_stderr": 0.021525965407408733,
"acc_norm": 0.2358974358974359,
"acc_norm_stderr": 0.021525965407408733
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.04582124160161549,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.04582124160161549
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2863247863247863,
"acc_stderr": 0.029614323690456645,
"acc_norm": 0.2863247863247863,
"acc_norm_stderr": 0.029614323690456645
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.22860791826309068,
"acc_stderr": 0.015016884698539897,
"acc_norm": 0.22860791826309068,
"acc_norm_stderr": 0.015016884698539897
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.27485380116959063,
"acc_stderr": 0.034240429246915824,
"acc_norm": 0.27485380116959063,
"acc_norm_stderr": 0.034240429246915824
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22766217870257038,
"mc1_stderr": 0.01467925503211107,
"mc2": 0.4891376724889372,
"mc2_stderr": 0.016320771330589307
},
"harness|winogrande|5": {
"acc": 0.47198105761641673,
"acc_stderr": 0.014030404213405786
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
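To work with the aggregated numbers above programmatically, you can load the "results" configuration directly (a minimal sketch, assuming the `results` configuration and `latest` split described earlier in this card):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_cloudyu__Qwen-72Bx2-MoE-120B",
                       "results",
                       split="latest")
print(results.column_names)  # inspect the schema before indexing into specific metrics
```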
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_cloudyu__Qwen-72Bx2-MoE-120B | [
"region:us"
] | 2024-01-15T10:53:11+00:00 | {"pretty_name": "Evaluation run of cloudyu/Qwen-72Bx2-MoE-120B", "dataset_summary": "Dataset automatically created during the evaluation run of model [cloudyu/Qwen-72Bx2-MoE-120B](https://huggingface.co/cloudyu/Qwen-72Bx2-MoE-120B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__Qwen-72Bx2-MoE-120B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-15T10:51:00.615971](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Qwen-72Bx2-MoE-120B/blob/main/results_2024-01-15T10-51-00.615971.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23280937126754725,\n \"acc_stderr\": 0.030031934560283337,\n \"acc_norm\": 0.2333859951011877,\n \"acc_norm_stderr\": 0.030826897864000263,\n \"mc1\": 0.22766217870257038,\n \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.4891376724889372,\n \"mc2_stderr\": 0.016320771330589307\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2090443686006826,\n \"acc_stderr\": 0.011882746987406453,\n \"acc_norm\": 0.2593856655290102,\n \"acc_norm_stderr\": 0.012808273573927099\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2590121489743079,\n \"acc_stderr\": 0.004371969542814558,\n \"acc_norm\": 0.24905397331208923,\n \"acc_norm_stderr\": 0.004315812968431582\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.1925925925925926,\n \"acc_stderr\": 0.03406542058502653,\n \"acc_norm\": 0.1925925925925926,\n \"acc_norm_stderr\": 0.03406542058502653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.025447863825108632,\n \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.025447863825108632\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 
0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03718489006818115,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03718489006818115\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2358974358974359,\n \"acc_stderr\": 0.021525965407408733,\n \"acc_norm\": 0.2358974358974359,\n \"acc_norm_stderr\": 0.021525965407408733\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.04582124160161549,\n \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.04582124160161549\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n \"acc_stderr\": 0.029614323690456645,\n \"acc_norm\": 0.2863247863247863,\n \"acc_norm_stderr\": 0.029614323690456645\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.22860791826309068,\n \"acc_stderr\": 
0.015016884698539897,\n \"acc_norm\": 0.22860791826309068,\n \"acc_norm_stderr\": 0.015016884698539897\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.27485380116959063,\n \"acc_stderr\": 0.034240429246915824,\n \"acc_norm\": 0.27485380116959063,\n \"acc_norm_stderr\": 0.034240429246915824\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22766217870257038,\n \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.4891376724889372,\n \"mc2_stderr\": 0.016320771330589307\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.47198105761641673,\n \"acc_stderr\": 0.014030404213405786\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/cloudyu/Qwen-72Bx2-MoE-120B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|arc:challenge|25_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|gsm8k|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hellaswag|10_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T10-51-00.615971.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T10-51-00.615971.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-15T10-51-00.615971.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-15T10-51-00.615971.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T10-51-00.615971.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T10-51-00.615971.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["**/details_harness|winogrande|5_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-15T10-51-00.615971.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_15T10_51_00.615971", "path": ["results_2024-01-15T10-51-00.615971.parquet"]}, {"split": "latest", "path": 
["results_2024-01-15T10-51-00.615971.parquet"]}]}]} | 2024-01-15T10:53:30+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of cloudyu/Qwen-72Bx2-MoE-120B
Dataset automatically created during the evaluation run of model cloudyu/Qwen-72Bx2-MoE-120B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
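A minimal sketch of that call is shown below; the repository name is assumed from the leaderboard's usual `details_<org>__<model>` naming convention, and `harness_winogrande_5` is one of the configurations listed in this record's metadata:

```python
from datasets import load_dataset

# Assumed repository name, following the open-llm-leaderboard details naming convention
data = load_dataset(
    "open-llm-leaderboard/details_cloudyu__Qwen-72Bx2-MoE-120B",
    "harness_winogrande_5",
    split="train",
)
```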
## Latest results
These are the latest results from run 2024-01-15T10:51:00.615971 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of cloudyu/Qwen-72Bx2-MoE-120B\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Qwen-72Bx2-MoE-120B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-15T10:51:00.615971(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of cloudyu/Qwen-72Bx2-MoE-120B\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/Qwen-72Bx2-MoE-120B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-15T10:51:00.615971(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
7be3e87059db8b08ecc590487c7616e30c652676 |
# Dataset Card for Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-sparsity-30
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [wang7776/Mistral-7B-Instruct-v0.2-sparsity-30](https://huggingface.co/wang7776/Mistral-7B-Instruct-v0.2-sparsity-30) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-30",
"harness_winogrande_5",
split="train")
```
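If needed, the 63 configuration names can also be enumerated programmatically before loading; a small sketch using the `datasets` helper for that (the repository name is the same as above):

```python
from datasets import get_dataset_config_names

# Lists configs such as "harness_arc_challenge_25", "harness_gsm8k_5", ..., "results"
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-30"
)
print(configs)
```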
## Latest results
These are the [latest results from run 2024-01-15T11:11:23.952137](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-30/blob/main/results_2024-01-15T11-11-23.952137.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.46463869066130487,
"acc_stderr": 0.034455801387647846,
"acc_norm": 0.4711053080225253,
"acc_norm_stderr": 0.035249688625421514,
"mc1": 0.2998776009791922,
"mc1_stderr": 0.016040352966713623,
"mc2": 0.4553313356020083,
"mc2_stderr": 0.01500792603148901
},
"harness|arc:challenge|25": {
"acc": 0.45563139931740615,
"acc_stderr": 0.014553749939306864,
"acc_norm": 0.5110921501706485,
"acc_norm_stderr": 0.014607794914013048
},
"harness|hellaswag|10": {
"acc": 0.5652260505875324,
"acc_stderr": 0.004947141797384131,
"acc_norm": 0.7572196773551085,
"acc_norm_stderr": 0.004278871104930366
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.46710526315789475,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.46710526315789475,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5207547169811321,
"acc_stderr": 0.030746349975723463,
"acc_norm": 0.5207547169811321,
"acc_norm_stderr": 0.030746349975723463
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.03804749744364763,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.03804749744364763
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171451,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171451
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33617021276595743,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.33617021276595743,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481425,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481425
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4068965517241379,
"acc_stderr": 0.04093793981266237,
"acc_norm": 0.4068965517241379,
"acc_norm_stderr": 0.04093793981266237
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.023068188848261135,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.023068188848261135
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5225806451612903,
"acc_stderr": 0.028414985019707868,
"acc_norm": 0.5225806451612903,
"acc_norm_stderr": 0.028414985019707868
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.03366124489051449,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.03366124489051449
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.03872592983524754,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.03872592983524754
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.0347327959083696,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.0347327959083696
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6580310880829016,
"acc_stderr": 0.034234651001042844,
"acc_norm": 0.6580310880829016,
"acc_norm_stderr": 0.034234651001042844
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.47435897435897434,
"acc_stderr": 0.025317649726448656,
"acc_norm": 0.47435897435897434,
"acc_norm_stderr": 0.025317649726448656
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02671924078371216,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02671924078371216
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.031968769891957786,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.031968769891957786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6146788990825688,
"acc_stderr": 0.020865850852794122,
"acc_norm": 0.6146788990825688,
"acc_norm_stderr": 0.020865850852794122
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.03441190023482465,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.03441190023482465
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5991561181434599,
"acc_stderr": 0.031900803894732356,
"acc_norm": 0.5991561181434599,
"acc_norm_stderr": 0.031900803894732356
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5246636771300448,
"acc_stderr": 0.03351695167652628,
"acc_norm": 0.5246636771300448,
"acc_norm_stderr": 0.03351695167652628
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.549618320610687,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.549618320610687,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.0478034362693679,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.0478034362693679
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5337423312883436,
"acc_stderr": 0.03919415545048409,
"acc_norm": 0.5337423312883436,
"acc_norm_stderr": 0.03919415545048409
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.044642857142857144,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.044642857142857144
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.04846748253977239,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.04846748253977239
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6709401709401709,
"acc_stderr": 0.03078232157768817,
"acc_norm": 0.6709401709401709,
"acc_norm_stderr": 0.03078232157768817
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6347381864623244,
"acc_stderr": 0.01721853002883864,
"acc_norm": 0.6347381864623244,
"acc_norm_stderr": 0.01721853002883864
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.026915047355369804,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.026915047355369804
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.02845263998508801,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.02845263998508801
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5401929260450161,
"acc_stderr": 0.028306190403305696,
"acc_norm": 0.5401929260450161,
"acc_norm_stderr": 0.028306190403305696
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.558641975308642,
"acc_stderr": 0.027628737155668773,
"acc_norm": 0.558641975308642,
"acc_norm_stderr": 0.027628737155668773
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.028663820147199492,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.028663820147199492
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.33376792698826596,
"acc_stderr": 0.012043812655846142,
"acc_norm": 0.33376792698826596,
"acc_norm_stderr": 0.012043812655846142
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45955882352941174,
"acc_stderr": 0.030273325077345755,
"acc_norm": 0.45955882352941174,
"acc_norm_stderr": 0.030273325077345755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4297385620915033,
"acc_stderr": 0.02002712278492854,
"acc_norm": 0.4297385620915033,
"acc_norm_stderr": 0.02002712278492854
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5551020408163265,
"acc_stderr": 0.031814251181977865,
"acc_norm": 0.5551020408163265,
"acc_norm_stderr": 0.031814251181977865
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6318407960199005,
"acc_stderr": 0.03410410565495301,
"acc_norm": 0.6318407960199005,
"acc_norm_stderr": 0.03410410565495301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-virology|5": {
"acc": 0.37349397590361444,
"acc_stderr": 0.03765845117168862,
"acc_norm": 0.37349397590361444,
"acc_norm_stderr": 0.03765845117168862
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6257309941520468,
"acc_stderr": 0.03711601185389481,
"acc_norm": 0.6257309941520468,
"acc_norm_stderr": 0.03711601185389481
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2998776009791922,
"mc1_stderr": 0.016040352966713623,
"mc2": 0.4553313356020083,
"mc2_stderr": 0.01500792603148901
},
"harness|winogrande|5": {
"acc": 0.6898184688239937,
"acc_stderr": 0.013000454144859893
},
"harness|gsm8k|5": {
"acc": 0.10538286580742987,
"acc_stderr": 0.00845757588404174
}
}
```
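The aggregated numbers above are also stored in the "results" configuration, whose "latest" split points at the most recent run. A minimal sketch for loading them follows; the exact nesting of the returned record may differ slightly from the JSON shown above:

```python
from datasets import load_dataset

# "results" config and "latest" split are listed in this dataset's metadata
results = load_dataset(
    "open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-30",
    "results",
    split="latest",
)
# One row per run; fields mirror the per-task acc / acc_norm / stderr values above
print(results[0])
```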
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-30 | [
"region:us"
] | 2024-01-15T11:13:12+00:00 | {"pretty_name": "Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-sparsity-30", "dataset_summary": "Dataset automatically created during the evaluation run of model [wang7776/Mistral-7B-Instruct-v0.2-sparsity-30](https://huggingface.co/wang7776/Mistral-7B-Instruct-v0.2-sparsity-30) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-30\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-15T11:11:23.952137](https://huggingface.co/datasets/open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-30/blob/main/results_2024-01-15T11-11-23.952137.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46463869066130487,\n \"acc_stderr\": 0.034455801387647846,\n \"acc_norm\": 0.4711053080225253,\n \"acc_norm_stderr\": 0.035249688625421514,\n \"mc1\": 0.2998776009791922,\n \"mc1_stderr\": 0.016040352966713623,\n \"mc2\": 0.4553313356020083,\n \"mc2_stderr\": 0.01500792603148901\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.45563139931740615,\n \"acc_stderr\": 0.014553749939306864,\n \"acc_norm\": 0.5110921501706485,\n \"acc_norm_stderr\": 0.014607794914013048\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5652260505875324,\n \"acc_stderr\": 0.004947141797384131,\n \"acc_norm\": 0.7572196773551085,\n \"acc_norm_stderr\": 0.004278871104930366\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.040601270352363966,\n \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.040601270352363966\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5207547169811321,\n \"acc_stderr\": 0.030746349975723463,\n \"acc_norm\": 0.5207547169811321,\n \"acc_norm_stderr\": 0.030746349975723463\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4305555555555556,\n \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.04140685639111503\n 
},\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n \"acc_stderr\": 0.03804749744364763,\n \"acc_norm\": 0.4682080924855491,\n \"acc_norm_stderr\": 0.03804749744364763\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171451,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171451\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.33617021276595743,\n \"acc_stderr\": 0.030881618520676942,\n \"acc_norm\": 0.33617021276595743,\n \"acc_norm_stderr\": 0.030881618520676942\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4068965517241379,\n \"acc_stderr\": 0.04093793981266237,\n \"acc_norm\": 0.4068965517241379,\n \"acc_norm_stderr\": 0.04093793981266237\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.023068188848261135,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.023068188848261135\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5225806451612903,\n \"acc_stderr\": 0.028414985019707868,\n \"acc_norm\": 0.5225806451612903,\n \"acc_norm_stderr\": 0.028414985019707868\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.03366124489051449,\n \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.03366124489051449\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5636363636363636,\n \"acc_stderr\": 0.03872592983524754,\n \"acc_norm\": 0.5636363636363636,\n \"acc_norm_stderr\": 0.03872592983524754\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.0347327959083696,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.0347327959083696\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6580310880829016,\n \"acc_stderr\": 0.034234651001042844,\n \"acc_norm\": 0.6580310880829016,\n 
\"acc_norm_stderr\": 0.034234651001042844\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.47435897435897434,\n \"acc_stderr\": 0.025317649726448656,\n \"acc_norm\": 0.47435897435897434,\n \"acc_norm_stderr\": 0.025317649726448656\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02671924078371216,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02671924078371216\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.031968769891957786,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.031968769891957786\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6146788990825688,\n \"acc_stderr\": 0.020865850852794122,\n \"acc_norm\": 0.6146788990825688,\n \"acc_norm_stderr\": 0.020865850852794122\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.03441190023482465,\n \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.03441190023482465\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5991561181434599,\n \"acc_stderr\": 0.031900803894732356,\n \"acc_norm\": 0.5991561181434599,\n \"acc_norm_stderr\": 0.031900803894732356\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5246636771300448,\n \"acc_stderr\": 0.03351695167652628,\n \"acc_norm\": 0.5246636771300448,\n \"acc_norm_stderr\": 0.03351695167652628\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.549618320610687,\n \"acc_stderr\": 0.04363643698524779,\n \"acc_norm\": 0.549618320610687,\n \"acc_norm_stderr\": 0.04363643698524779\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5337423312883436,\n \"acc_stderr\": 0.03919415545048409,\n \"acc_norm\": 0.5337423312883436,\n \"acc_norm_stderr\": 0.03919415545048409\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n \"acc_stderr\": 0.044642857142857144,\n \"acc_norm\": 0.33035714285714285,\n \"acc_norm_stderr\": 0.044642857142857144\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.04846748253977239,\n \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.04846748253977239\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6709401709401709,\n \"acc_stderr\": 0.03078232157768817,\n \"acc_norm\": 0.6709401709401709,\n \"acc_norm_stderr\": 0.03078232157768817\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6347381864623244,\n \"acc_stderr\": 0.01721853002883864,\n \"acc_norm\": 0.6347381864623244,\n \"acc_norm_stderr\": 0.01721853002883864\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4913294797687861,\n \"acc_stderr\": 0.026915047355369804,\n \"acc_norm\": 0.4913294797687861,\n \"acc_norm_stderr\": 0.026915047355369804\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.02845263998508801,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.02845263998508801\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5401929260450161,\n \"acc_stderr\": 0.028306190403305696,\n \"acc_norm\": 0.5401929260450161,\n \"acc_norm_stderr\": 0.028306190403305696\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.027628737155668773,\n \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.027628737155668773\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199492,\n \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199492\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.33376792698826596,\n \"acc_stderr\": 0.012043812655846142,\n \"acc_norm\": 0.33376792698826596,\n \"acc_norm_stderr\": 0.012043812655846142\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.45955882352941174,\n \"acc_stderr\": 0.030273325077345755,\n \"acc_norm\": 0.45955882352941174,\n \"acc_norm_stderr\": 0.030273325077345755\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4297385620915033,\n \"acc_stderr\": 0.02002712278492854,\n \"acc_norm\": 0.4297385620915033,\n \"acc_norm_stderr\": 0.02002712278492854\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5551020408163265,\n \"acc_stderr\": 0.031814251181977865,\n \"acc_norm\": 0.5551020408163265,\n \"acc_norm_stderr\": 0.031814251181977865\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n \"acc_stderr\": 0.03410410565495301,\n \"acc_norm\": 0.6318407960199005,\n \"acc_norm_stderr\": 0.03410410565495301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.37349397590361444,\n \"acc_stderr\": 0.03765845117168862,\n \"acc_norm\": 0.37349397590361444,\n \"acc_norm_stderr\": 0.03765845117168862\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6257309941520468,\n \"acc_stderr\": 0.03711601185389481,\n \"acc_norm\": 0.6257309941520468,\n \"acc_norm_stderr\": 0.03711601185389481\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2998776009791922,\n \"mc1_stderr\": 0.016040352966713623,\n \"mc2\": 0.4553313356020083,\n \"mc2_stderr\": 0.01500792603148901\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6898184688239937,\n \"acc_stderr\": 0.013000454144859893\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10538286580742987,\n 
\"acc_stderr\": 0.00845757588404174\n }\n}\n```", "repo_url": "https://huggingface.co/wang7776/Mistral-7B-Instruct-v0.2-sparsity-30", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|arc:challenge|25_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|gsm8k|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hellaswag|10_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T11-11-23.952137.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T11-11-23.952137.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-15T11-11-23.952137.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-15T11-11-23.952137.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T11-11-23.952137.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_15T11_11_23.952137", "path": ["**/details_harness|winogrande|5_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-15T11-11-23.952137.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_15T11_11_23.952137", "path": ["results_2024-01-15T11-11-23.952137.parquet"]}, {"split": "latest", "path": ["results_2024-01-15T11-11-23.952137.parquet"]}]}]} | 2024-01-15T11:13:32+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-sparsity-30
Dataset automatically created during the evaluation run of model wang7776/Mistral-7B-Instruct-v0.2-sparsity-30 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
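For example (a minimal sketch; the repository and configuration names below are the ones recorded in this card's file listing):

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-30",
    "harness_winogrande_5",  # one of the 63 per-task configurations
    split="train",           # "train" always points to the latest results
)
```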
## Latest results
These are the latest results from run 2024-01-15T11:11:23.952137 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
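The aggregated numbers can also be pulled programmatically from the run's "results" configuration (a sketch; the "results" config and "latest" split names are taken from this card's file listing):

```python
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_wang7776__Mistral-7B-Instruct-v0.2-sparsity-30",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics of the most recent evaluation run
```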
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-sparsity-30\n\n\n\nDataset automatically created during the evaluation run of model wang7776/Mistral-7B-Instruct-v0.2-sparsity-30 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-15T11:11:23.952137(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of wang7776/Mistral-7B-Instruct-v0.2-sparsity-30\n\n\n\nDataset automatically created during the evaluation run of model wang7776/Mistral-7B-Instruct-v0.2-sparsity-30 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-15T11:11:23.952137(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
1500792cd6998308da046ba72b688156b2ffa471 |
# CommitBench: A Benchmark for Commit Message Generation
## EXECUTIVE SUMMARY
We provide CommitBench as an open-source, reproducible, and privacy- and license-aware benchmark for commit message generation. The dataset is gathered from GitHub repositories with licenses that permit redistribution. We cover six programming languages: Java, Python, Go, JavaScript, PHP, and Ruby. The natural-language commit messages are restricted to English, as it is the working language in many software development projects. The dataset contains 1,664,590 examples obtained through extensive quality-focused filtering (e.g., excluding bot commits). Additionally, we provide a variant with longer sequences for benchmarking models that support extended input lengths.
## CURATION RATIONALE
We created this dataset due to quality and legal issues with previous commit message generation datasets. Given a git diff displaying code changes between two file versions, the task is to predict the accompanying commit message describing these changes in natural language. We base our GitHub repository selection on that of a previous dataset, CodeSearchNet, but apply a large number of filtering techniques to improve data quality and eliminate noise. Due to the original repository selection, we are also restricted to the aforementioned programming languages. It was important to us, however, to cover a range of programming languages, since the nature of the task can shift with how hardware-oriented a language is. The dataset is provided as a large CSV file containing all samples, with the following fields: Diff, Commit Message, Hash, Project, Split.
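As an illustration, the CSV can be read and split roughly as follows (a sketch: the file name is hypothetical, and the column spellings and split labels are assumed to match the field names listed above):

```python
import pandas as pd

# Hypothetical file name; CommitBench is distributed as one large CSV.
df = pd.read_csv("commitbench.csv")

# Select one partition via the Split column (label "train" assumed).
train = df[df["Split"] == "train"]

sample = train.iloc[0]
print(sample["Project"], sample["Hash"])
print(sample["Commit Message"])   # natural-language target
print(sample["Diff"][:500])       # beginning of the git diff input
```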
## DOCUMENTATION FOR SOURCE DATASETS
Repository selection is based on CodeSearchNet, which can be found under [https://github.com/github/CodeSearchNet](https://github.com/github/CodeSearchNet).
## LANGUAGE VARIETIES
Since GitHub hosts software projects from all over the world, there is no single uniform variety of English used across all commit messages. This means that phrasing can be regional or subject to influences from the programmer's native language. It also means that different spelling conventions may co-exist and that different terms may be used for the same concept. Any model trained on this data should take these factors into account.
### Overview of split by programming language for CommitBench:
- Java: 153,119
- Ruby: 233,710
- Go: 137,998
- JavaScript: 373,598
- Python: 472,469
- PHP: 294,394
## SPEAKER DEMOGRAPHIC
Due to the extremely diverse (geographically, but also socio-economically) backgrounds of the software development community, there is no single demographic the data comes from. Globally, the average software developer tends to be male and has obtained higher education. Due to the anonymous nature of GitHub profiles, gender distribution information cannot be extracted.
## ANNOTATOR DEMOGRAPHIC
Due to the automated generation of the dataset, no annotators were used.
## SPEECH SITUATION AND CHARACTERISTICS
The public and often business-related context in which the original GitHub users created the data fosters more neutral, information-focused, and formal language. Since developers not uncommonly find writing commit messages tedious, some commit messages instead reflect the frustration or boredom of their authors. While our filtering is intended to catch these types of messages, some may still remain in the dataset.
## PREPROCESSING AND DATA FORMATTING
See our paper for all preprocessing steps. We do not provide the un-processed raw data due to privacy concerns, but it can be obtained via CodeSearchNet or requested from the authors.
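Purely as an illustration of the kind of quality filter mentioned in the executive summary (excluding bot commits), and not the authors' actual pipeline, such a heuristic might look like this:

```python
import re

# Hypothetical heuristic for illustration only; the real CommitBench
# filtering steps are described in the paper.
BOT_PATTERN = re.compile(r"\[bot\]|dependabot|renovate", re.IGNORECASE)

def looks_like_bot_commit(author: str, message: str) -> bool:
    return bool(BOT_PATTERN.search(author) or BOT_PATTERN.search(message))

commits = [
    {"author": "dependabot[bot]", "message": "Bump lodash from 4.17.20 to 4.17.21"},
    {"author": "jane-doe", "message": "Fix off-by-one error in pagination"},
]
kept = [c for c in commits if not looks_like_bot_commit(c["author"], c["message"])]
print(len(kept))  # -> 1
```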
## CAPTURE QUALITY
While our dataset is completely reproducible at the time of writing, there are external dependencies that could restrict this. If GitHub shuts down, or if the owner of a software project in the dataset deletes their repository, some instances may no longer be reproducible.
## LIMITATIONS
While our filters are meant to ensure high quality for each data sample in the dataset, we cannot ensure that only low-quality examples were removed. Similarly, we cannot guarantee that our extensive filtering methods catch all low-quality examples; some might remain in the dataset. Another limitation of our dataset is the small number of programming languages covered (many more exist), as well as our focus on English commit messages.
## METADATA
- **License:** Dataset under the CC BY-NC 4.0 license, code under the MIT license
## DISCLOSURES AND ETHICAL REVIEW
While we put substantial effort into removing privacy-sensitive information, our solutions cannot find 100% of such cases. This means that researchers and anyone using the data need to incorporate their own safeguards to effectively reduce the amount of personal information that can be exposed.
## ABOUT THIS DOCUMENT
A data statement is a characterization of a dataset that provides context to allow developers and users to better understand how experimental results might generalize, how software might be appropriately deployed, and what biases might be reflected in systems built on the software.
This data statement was written based on the template for the Data Statements Version 2 schema. The template was prepared by Angelina McMillan-Major, Emily M. Bender, and Batya Friedman and can be found at [https://techpolicylab.uw.edu/data-statements/](https://techpolicylab.uw.edu/data-statements/) and was updated from the community Version 1 Markdown template by Leon Derczynski.
| Maxscha/commitbench | [
"size_categories:1M<n<10M",
"language:en",
"license:cc-by-nc-4.0",
"code",
"region:us"
] | 2024-01-15T11:17:17+00:00 | {"language": ["en"], "license": "cc-by-nc-4.0", "size_categories": ["1M<n<10M"], "tags": ["code"]} | 2024-02-14T11:19:43+00:00 | [] | [
"en"
] | TAGS
#size_categories-1M<n<10M #language-English #license-cc-by-nc-4.0 #code #region-us
|
# CommitBench: A Benchmark for Commit Message Generation
## EXECUTIVE SUMMARY
We provide CommitBench as an open-source, reproducible, and privacy- and license-aware benchmark for commit message generation. The dataset is gathered from GitHub repositories with licenses that permit redistribution. We cover six programming languages: Java, Python, Go, JavaScript, PHP, and Ruby. The natural-language commit messages are restricted to English, as it is the working language in many software development projects. The dataset contains 1,664,590 examples obtained through extensive quality-focused filtering (e.g., excluding bot commits). Additionally, we provide a variant with longer sequences for benchmarking models that support extended input lengths.
## CURATION RATIONALE
We created this dataset due to quality and legal issues with previous commit message generation datasets. Given a git diff displaying code changes between two file versions, the task is to predict the accompanying commit message describing these changes in natural language. We base our GitHub repository selection on that of a previous dataset, CodeSearchNet, but apply a large number of filtering techniques to improve data quality and eliminate noise. Due to the original repository selection, we are also restricted to the aforementioned programming languages. It was important to us, however, to cover a range of programming languages, since the nature of the task can shift with how hardware-oriented a language is. The dataset is provided as a large CSV file containing all samples, with the following fields: Diff, Commit Message, Hash, Project, Split.
## DOCUMENTATION FOR SOURCE DATASETS
Repository selection is based on CodeSearchNet, which can be found under URL
## LANGUAGE VARIETIES
Since GitHub hosts software projects from all over the world, there is no single uniform variety of English used across all commit messages. This means that phrasing can be regional or subject to influences from the programmer's native language. It also means that different spelling conventions may co-exist and that different terms may be used for the same concept. Any model trained on this data should take these factors into account.
### Overview of split by programming language for CommitBench:
- Java: 153,119
- Ruby: 233,710
- Go: 137,998
- JavaScript: 373,598
- Python: 472,469
- PHP: 294,394
## SPEAKER DEMOGRAPHIC
Due to the extremely diverse (geographically, but also socio-economically) backgrounds of the software development community, there is no single demographic the data comes from. Globally, the average software developer tends to be male and has obtained higher education. Due to the anonymous nature of GitHub profiles, gender distribution information cannot be extracted.
## ANNOTATOR DEMOGRAPHIC
Due to the automated generation of the dataset, no annotators were used.
## SPEECH SITUATION AND CHARACTERISTICS
The public nature and often business-related creation of the data by the original GitHub users fosters a more neutral, information-focused, and formal language. As it is not uncommon for developers to find the writing of commit messages tedious, there can also be commit messages representing the frustration or boredom of the commit author. While our filtering is supposed to catch these types of messages, there can be some instances still in the dataset.
## PREPROCESSING AND DATA FORMATTING
See our paper for all preprocessing steps. We do not provide the un-processed raw data due to privacy concerns, but it can be obtained via CodeSearchNet or requested from the authors.
## CAPTURE QUALITY
While our dataset is completely reproducible at the time of writing, there are external dependencies that could restrict this. If GitHub shuts down and someone with a software project in the dataset deletes their repository, there can be instances that are non-reproducible.
## LIMITATIONS
While our filters are meant to ensure a high quality for each data sample in the dataset, we cannot ensure that only low-quality examples were removed. Similarly, we cannot guarantee that our extensive filtering methods catch all low-quality examples. Some might remain in the dataset. Another limitation of our dataset is the low number of programming languages (there are many more) as well as our focus on English commit messages.
## METADATA
- License: Dataset under the CC BY-NC 4.0 license, code under the MIT license
## DISCLOSURES AND ETHICAL REVIEW
While we put substantial effort into removing privacy-sensitive information, our solutions cannot find 100% of such cases. This means that researchers and anyone using the data need to incorporate their own safeguards to effectively reduce the amount of personal information that can be exposed.
## ABOUT THIS DOCUMENT
A data statement is a characterization of a dataset that provides context to allow developers and users to better understand how experimental results might generalize, how software might be appropriately deployed, and what biases might be reflected in systems built on the software.
This data statement was written based on the template for the Data Statements Version 2 schema. The template was prepared by Angelina McMillan-Major, Emily M. Bender, and Batya Friedman and can be found at URL and was updated from the community Version 1 Markdown template by Leon Derczynski.
| [
"# CommitBench: A Benchmark for Commit Message Generation",
"## EXECUTIVE SUMMARY\nWe provide CommitBench as an open-source, reproducible and privacy- and license-aware benchmark for commit message generation. The dataset is gathered from GitHub repositories with licenses that permit redistribution. We provide six programming languages, Java, Python, Go, JavaScript, PHP, and Ruby. The commit messages in natural language are restricted to English, as it is the working language in many software development projects. The dataset has 1,664,590 examples that were generated by using extensive quality-focused filtering techniques (e.g., excluding bot commits). Additionally, we provide a version with longer sequences for benchmarking models with more extended sequence input.",
"## CURATION RATIONALE\nWe created this dataset due to quality and legal issues with previous commit message generation datasets. Given a git diff displaying code changes between two file versions, the task is to predict the accompanying commit message describing these changes in natural language. We base our GitHub repository selection on that of a previous dataset, CodeSearchNet, but apply a large number of filtering techniques to improve the data quality and eliminate noise. Due to the original repository selection, we are also restricted to the aforementioned programming languages. It was important to us, however, to provide some number of programming languages to accommodate any changes in the task due to the degree of hardware-relatedness of a language. The dataset is provided as a large CSV file containing all samples. We provide the following fields: Diff, Commit Message, Hash, Project, Split.",
"## DOCUMENTATION FOR SOURCE DATASETS\nRepository selection based on CodeSearchNet, which can be found under URL",
"## LANGUAGE VARIETIES\nSince GitHub hosts software projects from all over the world, there is no single uniform variety of English used across all commit messages. This means that phrasing can be regional or subject to influences from the programmer's native language. It also means that different spelling conventions may co-exist and that different terms may be used for the same concept. Any model trained on this data should take these factors into account.",
"### Overview of split by programming language for CommitBench:\n- Java: 153,119\n- Ruby: 233,710\n- Go: 137,998\n- JavaScript: 373,598\n- Python: 472,469\n- PHP: 294,394",
"## SPEAKER DEMOGRAPHIC\nDue to the extremely diverse (geographically, but also socio-economically) backgrounds of the software development community, there is no single demographic the data comes from. Globally, the average software developer tends to be male and has obtained higher education. Due to the anonymous nature of GitHub profiles, gender distribution information cannot be extracted.",
"## ANNOTATOR DEMOGRAPHIC\nDue to the automated generation of the dataset, no annotators were used.",
"## SPEECH SITUATION AND CHARACTERISTICS\nThe public nature and often business-related creation of the data by the original GitHub users fosters a more neutral, information-focused, and formal language. As it is not uncommon for developers to find the writing of commit messages tedious, there can also be commit messages representing the frustration or boredom of the commit author. While our filtering is supposed to catch these types of messages, there can be some instances still in the dataset.",
"## PREPROCESSING AND DATA FORMATTING\nSee our paper for all preprocessing steps. We do not provide the un-processed raw data due to privacy concerns, but it can be obtained via CodeSearchNet or requested from the authors.",
"## CAPTURE QUALITY\nWhile our dataset is completely reproducible at the time of writing, there are external dependencies that could restrict this. If GitHub shuts down and someone with a software project in the dataset deletes their repository, there can be instances that are non-reproducible.",
"## LIMITATIONS\nWhile our filters are meant to ensure a high quality for each data sample in the dataset, we cannot ensure that only low-quality examples were removed. Similarly, we cannot guarantee that our extensive filtering methods catch all low-quality examples. Some might remain in the dataset. Another limitation of our dataset is the low number of programming languages (there are many more) as well as our focus on English commit messages.",
"## METADATA\n- License: Dataset under the CC BY-NC 4.0 license, code under the MIT license",
"## DISCLOSURES AND ETHICAL REVIEW\nWhile we put substantial effort into removing privacy-sensitive information, our solutions cannot find 100% of such cases. This means that researchers and anyone using the data need to incorporate their own safeguards to effectively reduce the amount of personal information that can be exposed.",
"## ABOUT THIS DOCUMENT\nA data statement is a characterization of a dataset that provides context to allow developers and users to better understand how experimental results might generalize, how software might be appropriately deployed, and what biases might be reflected in systems built on the software.\n\nThis data statement was written based on the template for the Data Statements Version 2 schema. The template was prepared by Angelina McMillan-Major, Emily M. Bender, and Batya Friedman and can be found at URL and was updated from the community Version 1 Markdown template by Leon Derczynski."
] | [
"TAGS\n#size_categories-1M<n<10M #language-English #license-cc-by-nc-4.0 #code #region-us \n",
"# CommitBench: A Benchmark for Commit Message Generation",
"## EXECUTIVE SUMMARY\nWe provide CommitBench as an open-source, reproducible and privacy- and license-aware benchmark for commit message generation. The dataset is gathered from GitHub repositories with licenses that permit redistribution. We provide six programming languages, Java, Python, Go, JavaScript, PHP, and Ruby. The commit messages in natural language are restricted to English, as it is the working language in many software development projects. The dataset has 1,664,590 examples that were generated by using extensive quality-focused filtering techniques (e.g., excluding bot commits). Additionally, we provide a version with longer sequences for benchmarking models with more extended sequence input.",
"## CURATION RATIONALE\nWe created this dataset due to quality and legal issues with previous commit message generation datasets. Given a git diff displaying code changes between two file versions, the task is to predict the accompanying commit message describing these changes in natural language. We base our GitHub repository selection on that of a previous dataset, CodeSearchNet, but apply a large number of filtering techniques to improve the data quality and eliminate noise. Due to the original repository selection, we are also restricted to the aforementioned programming languages. It was important to us, however, to provide some number of programming languages to accommodate any changes in the task due to the degree of hardware-relatedness of a language. The dataset is provided as a large CSV file containing all samples. We provide the following fields: Diff, Commit Message, Hash, Project, Split.",
"## DOCUMENTATION FOR SOURCE DATASETS\nRepository selection based on CodeSearchNet, which can be found under URL",
"## LANGUAGE VARIETIES\nSince GitHub hosts software projects from all over the world, there is no single uniform variety of English used across all commit messages. This means that phrasing can be regional or subject to influences from the programmer's native language. It also means that different spelling conventions may co-exist and that different terms may be used for the same concept. Any model trained on this data should take these factors into account.",
"### Overview of split by programming language for CommitBench:\n- Java: 153,119\n- Ruby: 233,710\n- Go: 137,998\n- JavaScript: 373,598\n- Python: 472,469\n- PHP: 294,394",
"## SPEAKER DEMOGRAPHIC\nDue to the extremely diverse (geographically, but also socio-economically) backgrounds of the software development community, there is no single demographic the data comes from. Globally, the average software developer tends to be male and has obtained higher education. Due to the anonymous nature of GitHub profiles, gender distribution information cannot be extracted.",
"## ANNOTATOR DEMOGRAPHIC\nDue to the automated generation of the dataset, no annotators were used.",
"## SPEECH SITUATION AND CHARACTERISTICS\nThe public nature and often business-related creation of the data by the original GitHub users fosters a more neutral, information-focused, and formal language. As it is not uncommon for developers to find the writing of commit messages tedious, there can also be commit messages representing the frustration or boredom of the commit author. While our filtering is supposed to catch these types of messages, there can be some instances still in the dataset.",
"## PREPROCESSING AND DATA FORMATTING\nSee our paper for all preprocessing steps. We do not provide the un-processed raw data due to privacy concerns, but it can be obtained via CodeSearchNet or requested from the authors.",
"## CAPTURE QUALITY\nWhile our dataset is completely reproducible at the time of writing, there are external dependencies that could restrict this. If GitHub shuts down and someone with a software project in the dataset deletes their repository, there can be instances that are non-reproducible.",
"## LIMITATIONS\nWhile our filters are meant to ensure a high quality for each data sample in the dataset, we cannot ensure that only low-quality examples were removed. Similarly, we cannot guarantee that our extensive filtering methods catch all low-quality examples. Some might remain in the dataset. Another limitation of our dataset is the low number of programming languages (there are many more) as well as our focus on English commit messages.",
"## METADATA\n- License: Dataset under the CC BY-NC 4.0 license, code under the MIT license",
"## DISCLOSURES AND ETHICAL REVIEW\nWhile we put substantial effort into removing privacy-sensitive information, our solutions cannot find 100% of such cases. This means that researchers and anyone using the data need to incorporate their own safeguards to effectively reduce the amount of personal information that can be exposed.",
"## ABOUT THIS DOCUMENT\nA data statement is a characterization of a dataset that provides context to allow developers and users to better understand how experimental results might generalize, how software might be appropriately deployed, and what biases might be reflected in systems built on the software.\n\nThis data statement was written based on the template for the Data Statements Version 2 schema. The template was prepared by Angelina McMillan-Major, Emily M. Bender, and Batya Friedman and can be found at URL and was updated from the community Version 1 Markdown template by Leon Derczynski."
] |
74ca1d90c693c4f0b28afbe30a84f158b6ed93f1 |
# Dataset of fuma_mishandra (Touhou)
This is the dataset of fuma_mishandra (Touhou), containing 13 images and their tags.
The core tags of this character are `green_eyes, green_hair, long_hair, ribbon, hat, bangs, bow, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 17.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fuma_mishandra_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 10.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fuma_mishandra_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 29 | 19.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fuma_mishandra_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 15.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fuma_mishandra_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 29 | 25.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fuma_mishandra_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/fuma_mishandra_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, solo, looking_at_viewer, skirt, book, juliet_sleeves, dress, black_pantyhose, blush, closed_mouth, full_body, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | skirt | book | juliet_sleeves | dress | black_pantyhose | blush | closed_mouth | full_body | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:-------|:-----------------|:--------|:------------------|:--------|:---------------|:------------|:-------------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/fuma_mishandra_touhou | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-15T11:17:26+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-15T11:20:10+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of fuma\_mishandra (Touhou)
===================================
This is the dataset of fuma\_mishandra (Touhou), containing 13 images and their tags.
The core tags of this character are 'green\_eyes, green\_hair, long\_hair, ribbon, hat, bangs, bow, very\_long\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
a435e4df80ff06a2f8e6a293a953dcf53c4a60de | # Dataset Card for "code-instructions-ita-dpo-small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mii-llm/code-ita-dpo-small | [
"region:us"
] | 2024-01-15T11:20:11+00:00 | {"dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2280369, "num_examples": 609}], "download_size": 1090146, "dataset_size": 2280369}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-15T23:39:40+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "code-instructions-ita-dpo-small"
More Information needed | [
"# Dataset Card for \"code-instructions-ita-dpo-small\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"code-instructions-ita-dpo-small\"\n\nMore Information needed"
] |
381ab668026a3be5c2ec9d5ad1bf8b5a88ee5c3a |
# Dataset Card for Evaluation run of alnrg2arg/test2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alnrg2arg/test2](https://huggingface.co/alnrg2arg/test2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alnrg2arg__test2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-15T11:22:11.663514](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__test2/blob/main/results_2024-01-15T11-22-11.663514.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24636436924076846,
"acc_stderr": 0.03057531615216942,
"acc_norm": 0.24707158644894944,
"acc_norm_stderr": 0.031385477138922584,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662571,
"mc2": 0.5013831681930769,
"mc2_stderr": 0.017248638043307455
},
"harness|arc:challenge|25": {
"acc": 0.23293515358361774,
"acc_stderr": 0.012352507042617408,
"acc_norm": 0.2721843003412969,
"acc_norm_stderr": 0.013006600406423706
},
"harness|hellaswag|10": {
"acc": 0.2539334793865764,
"acc_stderr": 0.0043437045123801,
"acc_norm": 0.26249751045608444,
"acc_norm_stderr": 0.004390923353200561
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2,
"acc_stderr": 0.03455473702325437,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03455473702325437
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2236842105263158,
"acc_stderr": 0.03391160934343604,
"acc_norm": 0.2236842105263158,
"acc_norm_stderr": 0.03391160934343604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.0336876293225943,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.0336876293225943
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2723404255319149,
"acc_stderr": 0.029101290698386708,
"acc_norm": 0.2723404255319149,
"acc_norm_stderr": 0.029101290698386708
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.03752833958003336,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.03752833958003336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23544973544973544,
"acc_stderr": 0.021851509822031715,
"acc_norm": 0.23544973544973544,
"acc_norm_stderr": 0.021851509822031715
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24516129032258063,
"acc_stderr": 0.024472243840895518,
"acc_norm": 0.24516129032258063,
"acc_norm_stderr": 0.024472243840895518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.03090379695211449,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.03090379695211449
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.20606060606060606,
"acc_stderr": 0.03158415324047709,
"acc_norm": 0.20606060606060606,
"acc_norm_stderr": 0.03158415324047709
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.29292929292929293,
"acc_stderr": 0.03242497958178817,
"acc_norm": 0.29292929292929293,
"acc_norm_stderr": 0.03242497958178817
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2694300518134715,
"acc_stderr": 0.03201867122877794,
"acc_norm": 0.2694300518134715,
"acc_norm_stderr": 0.03201867122877794
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.02213908110397153,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.02213908110397153
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.0260671592222758,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.0260671592222758
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24369747899159663,
"acc_stderr": 0.027886828078380558,
"acc_norm": 0.24369747899159663,
"acc_norm_stderr": 0.027886828078380558
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22935779816513763,
"acc_stderr": 0.018025349724618684,
"acc_norm": 0.22935779816513763,
"acc_norm_stderr": 0.018025349724618684
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24537037037037038,
"acc_stderr": 0.02934666509437295,
"acc_norm": 0.24537037037037038,
"acc_norm_stderr": 0.02934666509437295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.028458820991460295,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.028458820991460295
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2242152466367713,
"acc_stderr": 0.027991534258519527,
"acc_norm": 0.2242152466367713,
"acc_norm_stderr": 0.027991534258519527
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2231404958677686,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.2231404958677686,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052191,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052191
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2822085889570552,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.2822085889570552,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467764,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467764
},
"harness|hendrycksTest-management|5": {
"acc": 0.23300970873786409,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.23300970873786409,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.027046857630716667,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.027046857630716667
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24648786717752236,
"acc_stderr": 0.015411308769686938,
"acc_norm": 0.24648786717752236,
"acc_norm_stderr": 0.015411308769686938
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0230836585869842,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0230836585869842
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2558659217877095,
"acc_stderr": 0.014593620923210746,
"acc_norm": 0.2558659217877095,
"acc_norm_stderr": 0.014593620923210746
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351277,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351277
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2861736334405145,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.2861736334405145,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24691358024691357,
"acc_stderr": 0.02399350170904212,
"acc_norm": 0.24691358024691357,
"acc_norm_stderr": 0.02399350170904212
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843003,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843003
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24967405475880053,
"acc_stderr": 0.011054538377832322,
"acc_norm": 0.24967405475880053,
"acc_norm_stderr": 0.011054538377832322
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.21323529411764705,
"acc_stderr": 0.02488097151229426,
"acc_norm": 0.21323529411764705,
"acc_norm_stderr": 0.02488097151229426
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25326797385620914,
"acc_stderr": 0.01759348689536683,
"acc_norm": 0.25326797385620914,
"acc_norm_stderr": 0.01759348689536683
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22040816326530613,
"acc_stderr": 0.026537045312145284,
"acc_norm": 0.22040816326530613,
"acc_norm_stderr": 0.026537045312145284
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.030769444967296018,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.030769444967296018
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25301204819277107,
"acc_stderr": 0.033844291552331346,
"acc_norm": 0.25301204819277107,
"acc_norm_stderr": 0.033844291552331346
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662571,
"mc2": 0.5013831681930769,
"mc2_stderr": 0.017248638043307455
},
"harness|winogrande|5": {
"acc": 0.4988161010260458,
"acc_stderr": 0.014052446290529022
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
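As a small illustration of consuming this results file directly, the sketch below downloads the per-run JSON linked above and averages the `acc_norm` scores of the `hendrycksTest-*` (MMLU) subtasks. The file name is taken from the "Latest results" link; the exact top-level layout of the JSON on disk is an assumption, so the code falls back to the structure shown in the excerpt.

```python
import json
from statistics import mean

from huggingface_hub import hf_hub_download

# File name taken from the "Latest results" link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_alnrg2arg__test2",
    repo_type="dataset",
    filename="results_2024-01-15T11-22-11.663514.json",
)

with open(path) as f:
    data = json.load(f)

# Task entries are keyed like "harness|hendrycksTest-...|5" in the excerpt above;
# if the file nests them under a top-level "results" key, use that instead.
results = data.get("results", data)

mmlu = [
    v["acc_norm"]
    for k, v in results.items()
    if "hendrycksTest" in k and isinstance(v, dict) and "acc_norm" in v
]
print(f"Mean MMLU acc_norm over {len(mmlu)} subtasks: {mean(mmlu):.4f}")
```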
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_alnrg2arg__test2 | [
"region:us"
] | 2024-01-15T11:24:30+00:00 | {"pretty_name": "Evaluation run of alnrg2arg/test2", "dataset_summary": "Dataset automatically created during the evaluation run of model [alnrg2arg/test2](https://huggingface.co/alnrg2arg/test2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alnrg2arg__test2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-15T11:22:11.663514](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__test2/blob/main/results_2024-01-15T11-22-11.663514.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24636436924076846,\n \"acc_stderr\": 0.03057531615216942,\n \"acc_norm\": 0.24707158644894944,\n \"acc_norm_stderr\": 0.031385477138922584,\n \"mc1\": 0.2460220318237454,\n \"mc1_stderr\": 0.015077219200662571,\n \"mc2\": 0.5013831681930769,\n \"mc2_stderr\": 0.017248638043307455\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.23293515358361774,\n \"acc_stderr\": 0.012352507042617408,\n \"acc_norm\": 0.2721843003412969,\n \"acc_norm_stderr\": 0.013006600406423706\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2539334793865764,\n \"acc_stderr\": 0.0043437045123801,\n \"acc_norm\": 0.26249751045608444,\n \"acc_norm_stderr\": 0.004390923353200561\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.03455473702325437,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.03455473702325437\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.03391160934343604,\n \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.03391160934343604\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.24305555555555555,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n 
\"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n \"acc_stderr\": 0.0336876293225943,\n \"acc_norm\": 0.2658959537572254,\n \"acc_norm_stderr\": 0.0336876293225943\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2723404255319149,\n \"acc_stderr\": 0.029101290698386708,\n \"acc_norm\": 0.2723404255319149,\n \"acc_norm_stderr\": 0.029101290698386708\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.03752833958003336,\n \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.03752833958003336\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.23544973544973544,\n \"acc_stderr\": 0.021851509822031715,\n \"acc_norm\": 0.23544973544973544,\n \"acc_norm_stderr\": 0.021851509822031715\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24516129032258063,\n \"acc_stderr\": 0.024472243840895518,\n \"acc_norm\": 0.24516129032258063,\n \"acc_norm_stderr\": 0.024472243840895518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.03090379695211449,\n \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.03090379695211449\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.20606060606060606,\n \"acc_stderr\": 0.03158415324047709,\n \"acc_norm\": 0.20606060606060606,\n \"acc_norm_stderr\": 0.03158415324047709\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.29292929292929293,\n \"acc_stderr\": 0.03242497958178817,\n \"acc_norm\": 0.29292929292929293,\n \"acc_norm_stderr\": 0.03242497958178817\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.2694300518134715,\n \"acc_stderr\": 0.03201867122877794,\n \"acc_norm\": 0.2694300518134715,\n \"acc_norm_stderr\": 0.03201867122877794\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.02213908110397153,\n 
\"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.02213908110397153\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.0260671592222758,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.0260671592222758\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.24369747899159663,\n \"acc_stderr\": 0.027886828078380558,\n \"acc_norm\": 0.24369747899159663,\n \"acc_norm_stderr\": 0.027886828078380558\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.22935779816513763,\n \"acc_stderr\": 0.018025349724618684,\n \"acc_norm\": 0.22935779816513763,\n \"acc_norm_stderr\": 0.018025349724618684\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.24537037037037038,\n \"acc_stderr\": 0.02934666509437295,\n \"acc_norm\": 0.24537037037037038,\n \"acc_norm_stderr\": 0.02934666509437295\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25738396624472576,\n \"acc_stderr\": 0.028458820991460295,\n \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.028458820991460295\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2242152466367713,\n \"acc_stderr\": 0.027991534258519527,\n \"acc_norm\": 0.2242152466367713,\n \"acc_norm_stderr\": 0.027991534258519527\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2231404958677686,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.2231404958677686,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.04284467968052191,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.04284467968052191\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2822085889570552,\n \"acc_stderr\": 0.03536117886664743,\n \"acc_norm\": 0.2822085889570552,\n \"acc_norm_stderr\": 0.03536117886664743\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n \"acc_stderr\": 0.04203277291467764,\n \"acc_norm\": 0.26785714285714285,\n \"acc_norm_stderr\": 0.04203277291467764\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.21794871794871795,\n \"acc_stderr\": 0.027046857630716667,\n \"acc_norm\": 0.21794871794871795,\n \"acc_norm_stderr\": 0.027046857630716667\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24648786717752236,\n \"acc_stderr\": 0.015411308769686938,\n \"acc_norm\": 0.24648786717752236,\n \"acc_norm_stderr\": 
0.015411308769686938\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.0230836585869842,\n \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.0230836585869842\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2558659217877095,\n \"acc_stderr\": 0.014593620923210746,\n \"acc_norm\": 0.2558659217877095,\n \"acc_norm_stderr\": 0.014593620923210746\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351277,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351277\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2861736334405145,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.2861736334405145,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.24691358024691357,\n \"acc_stderr\": 0.02399350170904212,\n \"acc_norm\": 0.24691358024691357,\n \"acc_norm_stderr\": 0.02399350170904212\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843003,\n \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843003\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24967405475880053,\n \"acc_stderr\": 0.011054538377832322,\n \"acc_norm\": 0.24967405475880053,\n \"acc_norm_stderr\": 0.011054538377832322\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.21323529411764705,\n \"acc_stderr\": 0.02488097151229426,\n \"acc_norm\": 0.21323529411764705,\n \"acc_norm_stderr\": 0.02488097151229426\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25326797385620914,\n \"acc_stderr\": 0.01759348689536683,\n \"acc_norm\": 0.25326797385620914,\n \"acc_norm_stderr\": 0.01759348689536683\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.22040816326530613,\n \"acc_stderr\": 0.026537045312145284,\n \"acc_norm\": 0.22040816326530613,\n \"acc_norm_stderr\": 0.026537045312145284\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n \"acc_stderr\": 0.030769444967296018,\n \"acc_norm\": 0.2537313432835821,\n \"acc_norm_stderr\": 0.030769444967296018\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25301204819277107,\n \"acc_stderr\": 0.033844291552331346,\n \"acc_norm\": 0.25301204819277107,\n \"acc_norm_stderr\": 0.033844291552331346\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.031267817146631786,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.031267817146631786\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n \"mc1_stderr\": 0.015077219200662571,\n \"mc2\": 0.5013831681930769,\n \"mc2_stderr\": 0.017248638043307455\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4988161010260458,\n \"acc_stderr\": 0.014052446290529022\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/alnrg2arg/test2", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|arc:challenge|25_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|gsm8k|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hellaswag|10_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T11-22-11.663514.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T11-22-11.663514.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-15T11-22-11.663514.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-15T11-22-11.663514.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T11-22-11.663514.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T11-22-11.663514.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["**/details_harness|winogrande|5_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-15T11-22-11.663514.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_15T11_22_11.663514", "path": ["results_2024-01-15T11-22-11.663514.parquet"]}, {"split": "latest", "path": 
["results_2024-01-15T11-22-11.663514.parquet"]}]}]} | 2024-01-15T11:24:51+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of alnrg2arg/test2
Dataset automatically created during the evaluation run of model alnrg2arg/test2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
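A minimal sketch of such a call is given below. The repository id `open-llm-leaderboard/details_alnrg2arg__test2` is an assumption based on the leaderboard's usual `details_<org>__<model>` naming convention (it is not spelled out in this summary), while the `harness_winogrande_5` configuration name comes from the config list of this run.

```python
from datasets import load_dataset

# Load the per-sample details for one task configuration of this run;
# the "train" split always points to the latest results for that task.
# Repository id below is assumed from the leaderboard naming convention.
data = load_dataset("open-llm-leaderboard/details_alnrg2arg__test2",
                    "harness_winogrande_5",
                    split="train")
```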
## Latest results
These are the latest results from run 2024-01-15T11:22:11.663514 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of alnrg2arg/test2\n\n\n\nDataset automatically created during the evaluation run of model alnrg2arg/test2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-15T11:22:11.663514(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of alnrg2arg/test2\n\n\n\nDataset automatically created during the evaluation run of model alnrg2arg/test2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-15T11:22:11.663514(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
9a0ea0accfba278d1174c2cf61cd47b0c4771f46 |
# Dataset Card for Evaluation run of naseerfaheem/SOLAR-10.7B-Instruct-ties
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [naseerfaheem/SOLAR-10.7B-Instruct-ties](https://huggingface.co/naseerfaheem/SOLAR-10.7B-Instruct-ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_naseerfaheem__SOLAR-10.7B-Instruct-ties",
"harness_winogrande_5",
split="train")
```
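
The aggregated scores mentioned above are stored in the separate "results" configuration of the same repository. A small illustrative sketch of reading them is shown below; the exact row layout may differ, but the metric names mirror the JSON in the next section.

```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of the run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_naseerfaheem__SOLAR-10.7B-Instruct-ties",
                       "results",
                       split="latest")
print(results[0])  # e.g. the "all" averages, per-task acc/acc_norm, mc1/mc2
```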
## Latest results
These are the [latest results from run 2024-01-15T11:26:12.738490](https://huggingface.co/datasets/open-llm-leaderboard/details_naseerfaheem__SOLAR-10.7B-Instruct-ties/blob/main/results_2024-01-15T11-26-12.738490.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.667016123553577,
"acc_stderr": 0.03159957823768967,
"acc_norm": 0.667926905859475,
"acc_norm_stderr": 0.03224206176505658,
"mc1": 0.5703794369645043,
"mc1_stderr": 0.017329234580409095,
"mc2": 0.7187641653348525,
"mc2_stderr": 0.01496712981786399
},
"harness|arc:challenge|25": {
"acc": 0.6834470989761092,
"acc_stderr": 0.013592431519068079,
"acc_norm": 0.7090443686006825,
"acc_norm_stderr": 0.013273077865907593
},
"harness|hellaswag|10": {
"acc": 0.7159928301135232,
"acc_stderr": 0.004500186424443792,
"acc_norm": 0.8857797251543518,
"acc_norm_stderr": 0.0031742854949621665
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03583496176361072,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03583496176361072
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266346,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6212765957446809,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.6212765957446809,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.04028731532947558,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.04028731532947558
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.02574806587167328,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.02574806587167328
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.022185710092252255,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.022185710092252255
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603348,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603348
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.02931820364520686,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.02931820364520686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.03372343271653062,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.03372343271653062
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02485747808025046,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02485747808025046
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8523206751054853,
"acc_stderr": 0.0230943295825957,
"acc_norm": 0.8523206751054853,
"acc_norm_stderr": 0.0230943295825957
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709696,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709696
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294406999,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294406999
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.023176298203992005,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.023176298203992005
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40782122905027934,
"acc_stderr": 0.016435865260914746,
"acc_norm": 0.40782122905027934,
"acc_norm_stderr": 0.016435865260914746
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7331189710610932,
"acc_stderr": 0.025122637608816643,
"acc_norm": 0.7331189710610932,
"acc_norm_stderr": 0.025122637608816643
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7746913580246914,
"acc_stderr": 0.023246202647819753,
"acc_norm": 0.7746913580246914,
"acc_norm_stderr": 0.023246202647819753
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49608865710560623,
"acc_stderr": 0.012769845366441192,
"acc_norm": 0.49608865710560623,
"acc_norm_stderr": 0.012769845366441192
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.75,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.75,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069446,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069446
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5903614457831325,
"acc_stderr": 0.038284011150790206,
"acc_norm": 0.5903614457831325,
"acc_norm_stderr": 0.038284011150790206
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5703794369645043,
"mc1_stderr": 0.017329234580409095,
"mc2": 0.7187641653348525,
"mc2_stderr": 0.01496712981786399
},
"harness|winogrande|5": {
"acc": 0.835043409629045,
"acc_stderr": 0.01043091746823743
},
"harness|gsm8k|5": {
"acc": 0.640636846095527,
"acc_stderr": 0.01321645630985153
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_naseerfaheem__SOLAR-10.7B-Instruct-ties | [
"region:us"
] | 2024-01-15T11:28:28+00:00 | {"pretty_name": "Evaluation run of naseerfaheem/SOLAR-10.7B-Instruct-ties", "dataset_summary": "Dataset automatically created during the evaluation run of model [naseerfaheem/SOLAR-10.7B-Instruct-ties](https://huggingface.co/naseerfaheem/SOLAR-10.7B-Instruct-ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_naseerfaheem__SOLAR-10.7B-Instruct-ties\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-15T11:26:12.738490](https://huggingface.co/datasets/open-llm-leaderboard/details_naseerfaheem__SOLAR-10.7B-Instruct-ties/blob/main/results_2024-01-15T11-26-12.738490.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.667016123553577,\n \"acc_stderr\": 0.03159957823768967,\n \"acc_norm\": 0.667926905859475,\n \"acc_norm_stderr\": 0.03224206176505658,\n \"mc1\": 0.5703794369645043,\n \"mc1_stderr\": 0.017329234580409095,\n \"mc2\": 0.7187641653348525,\n \"mc2_stderr\": 0.01496712981786399\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6834470989761092,\n \"acc_stderr\": 0.013592431519068079,\n \"acc_norm\": 0.7090443686006825,\n \"acc_norm_stderr\": 0.013273077865907593\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7159928301135232,\n \"acc_stderr\": 0.004500186424443792,\n \"acc_norm\": 0.8857797251543518,\n \"acc_norm_stderr\": 0.0031742854949621665\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361072,\n \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361072\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266346,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266346\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6212765957446809,\n \"acc_stderr\": 0.03170995606040655,\n \"acc_norm\": 0.6212765957446809,\n \"acc_norm_stderr\": 0.03170995606040655\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947558,\n \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947558\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.02574806587167328,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.02574806587167328\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8129032258064516,\n \"acc_stderr\": 0.022185710092252255,\n \"acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.022185710092252255\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n \"acc_norm\": 0.9015544041450777,\n 
\"acc_norm_stderr\": 0.02150024957603348\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.02485747808025046,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02485747808025046\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8523206751054853,\n \"acc_stderr\": 0.0230943295825957,\n \"acc_norm\": 0.8523206751054853,\n \"acc_norm_stderr\": 0.0230943295825957\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709696,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709696\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n \"acc_stderr\": 0.014000791294406999,\n \"acc_norm\": 0.8109833971902938,\n \"acc_norm_stderr\": 0.014000791294406999\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992005,\n \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992005\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40782122905027934,\n \"acc_stderr\": 0.016435865260914746,\n \"acc_norm\": 0.40782122905027934,\n \"acc_norm_stderr\": 0.016435865260914746\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982478,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982478\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7331189710610932,\n \"acc_stderr\": 0.025122637608816643,\n \"acc_norm\": 0.7331189710610932,\n \"acc_norm_stderr\": 0.025122637608816643\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7746913580246914,\n \"acc_stderr\": 0.023246202647819753,\n \"acc_norm\": 0.7746913580246914,\n \"acc_norm_stderr\": 0.023246202647819753\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49608865710560623,\n \"acc_stderr\": 0.012769845366441192,\n \"acc_norm\": 0.49608865710560623,\n \"acc_norm_stderr\": 0.012769845366441192\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.026303648393696036,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.026303648393696036\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5903614457831325,\n \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.5903614457831325,\n \"acc_norm_stderr\": 0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5703794369645043,\n \"mc1_stderr\": 0.017329234580409095,\n \"mc2\": 0.7187641653348525,\n \"mc2_stderr\": 0.01496712981786399\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.01043091746823743\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.640636846095527,\n 
\"acc_stderr\": 0.01321645630985153\n }\n}\n```", "repo_url": "https://huggingface.co/naseerfaheem/SOLAR-10.7B-Instruct-ties", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|arc:challenge|25_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|gsm8k|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hellaswag|10_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T11-26-12.738490.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T11-26-12.738490.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-15T11-26-12.738490.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-15T11-26-12.738490.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T11-26-12.738490.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_15T11_26_12.738490", "path": ["**/details_harness|winogrande|5_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-15T11-26-12.738490.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_15T11_26_12.738490", "path": ["results_2024-01-15T11-26-12.738490.parquet"]}, {"split": "latest", "path": ["results_2024-01-15T11-26-12.738490.parquet"]}]}]} | 2024-01-15T11:28:48+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of naseerfaheem/SOLAR-10.7B-Instruct-ties
Dataset automatically created during the evaluation run of model naseerfaheem/SOLAR-10.7B-Instruct-ties on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
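The snippet below assumes the details repository follows the leaderboard's usual `details_<org>__<model>` naming convention and uses the Winogrande config (listed in this card's configuration table) as an example; adjust both if they differ for this run.

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_naseerfaheem__SOLAR-10.7B-Instruct-ties",
    "harness_winogrande_5",
    split="train",
)
```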
## Latest results
These are the latest results from run 2024-01-15T11:26:12.738490 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of naseerfaheem/SOLAR-10.7B-Instruct-ties\n\n\n\nDataset automatically created during the evaluation run of model naseerfaheem/SOLAR-10.7B-Instruct-ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-15T11:26:12.738490(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of naseerfaheem/SOLAR-10.7B-Instruct-ties\n\n\n\nDataset automatically created during the evaluation run of model naseerfaheem/SOLAR-10.7B-Instruct-ties on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-15T11:26:12.738490(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
1decf9aea4cff64ecf82a33e5672142e3dfd877b | This is a dataset made for the purpose of evaluating Text-to-SQL systems for geography-based applications.
Currently, we have released only 109 (natural_language, sql_query) pairs.
Steps:
1. First, unzip all the .shp files and load them into your postgres database instance.
2. Load the (text, sql) pairs from the .csv file into your desired program.
3. Generate SQL for the questions using your own LLM and compare the results any way you like.
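A minimal evaluation loop could look like the sketch below. The CSV filename, its column names (`question`, `sql_query`), the connection string, and the `generate_sql()` helper are all assumptions — adapt them to the actual files in this repository and to your own LLM client.

```python
import pandas as pd
import psycopg2

# Assumed connection string; point it at the PostGIS-enabled database
# that the shapefiles were loaded into (see the requirements below).
conn = psycopg2.connect("dbname=geosql user=postgres host=localhost")

def run_query(sql: str):
    """Execute a query and return its rows as a sorted list of tuples."""
    with conn.cursor() as cur:
        cur.execute(sql)
        return sorted(cur.fetchall())

def generate_sql(question: str) -> str:
    """Placeholder: call your own LLM here and return the SQL it produces."""
    raise NotImplementedError

pairs = pd.read_csv("geosql_eval.csv")  # assumed columns: question, sql_query
correct = 0
for _, row in pairs.iterrows():
    predicted = generate_sql(row["question"])
    try:
        if run_query(predicted) == run_query(row["sql_query"]):
            correct += 1
    except psycopg2.Error:
        conn.rollback()  # a malformed prediction should not abort the run
print(f"Execution accuracy: {correct / len(pairs):.2%}")
```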
### Requirements
1. You need PostgreSQL installed with the PostGIS extension enabled.
2. You need the TIGER geocoder enabled for the state of Florida only,
i.e., the geocoding in this dataset covers only addresses in Florida.
For more information on installing the TIGER geocoder, see the book PostGIS in Action by R. Obe and L. Hsu,
Chapter 10: PostGIS TIGER geocoder
Copyright
reAlpha Tech Corp, 2024
Made by:
ML Team, Naamche | naamche/geosql-llm-eval | [
"size_categories:n<1K",
"language:en",
"license:mit",
"region:us"
] | 2024-01-15T12:09:42+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["n<1K"]} | 2024-01-15T12:24:06+00:00 | [] | [
"en"
] | TAGS
#size_categories-n<1K #language-English #license-mit #region-us
| This is a dataset made for the purpose of evaluating Text-to-SQL systems for geography-based applications.
Currently, we have released only 109 (natural_language, sql_query) pairs.
Steps:
1. First, unzip all the .shp files and load them into your postgres database instance.
2. Load the (text, sql) pairs from the .csv file into your desired program.
3. Generate SQL for the questions using your own LLM and compare the results any way you like.
### Requirements
1. You need PostgreSQL installed with the PostGIS extension enabled.
2. You need the TIGER geocoder enabled for the state of Florida only,
i.e., the geocoding in this dataset covers only addresses in Florida.
For more information on installing the TIGER geocoder, see the book PostGIS in Action by R. Obe and L. Hsu,
Chapter 10: PostGIS TIGER geocoder
Copyright
reAlpha Tech Corp, 2024
Made by:
ML Team, Naamche | [] | [
"TAGS\n#size_categories-n<1K #language-English #license-mit #region-us \n"
] |
8228a37ca8d3c733f5e559cc99e3be9f8de170f9 |
# Dataset Card for Common Voice Corpus 16
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://commonvoice.mozilla.org/en/datasets
- **Repository:** https://github.com/common-voice/common-voice
- **Paper:** https://arxiv.org/abs/1912.06670
- **Leaderboard:** https://paperswithcode.com/dataset/common-voice
- **Point of Contact:** [Vaibhav Srivastav](mailto:[email protected])
### Dataset Summary
The Common Voice dataset consists of a unique MP3 and corresponding text file.
Many of the 30328 recorded hours in the dataset also include demographic metadata like age, sex, and accent
that can help improve the accuracy of speech recognition engines.
The dataset currently consists of 19673 validated hours in 120 languages, but more voices and languages are always added.
Take a look at the [Languages](https://commonvoice.mozilla.org/en/languages) page to request a language or start contributing.
### Languages
```
Abkhaz, Afrikaans, Albanian, Amharic, Arabic, Armenian, Assamese, Asturian, Azerbaijani, Basaa, Bashkir, Basque, Belarusian, Bengali, Breton, Bulgarian, Cantonese, Catalan, Central Kurdish, Chinese (China), Chinese (Hong Kong), Chinese (Taiwan), Chuvash, Czech, Danish, Dhivehi, Dioula, Dutch, English, Erzya, Esperanto, Estonian, Finnish, French, Frisian, Galician, Georgian, German, Greek, Guarani, Hakha Chin, Hausa, Hebrew, Hill Mari, Hindi, Hungarian, Icelandic, Igbo, Indonesian, Interlingua, Irish, Italian, Japanese, Kabyle, Kazakh, Kinyarwanda, Korean, Kurmanji Kurdish, Kyrgyz, Lao, Latgalian, Latvian, Ligurian, Lithuanian, Luganda, Macedonian, Malayalam, Maltese, Marathi, Meadow Mari, Moksha, Mongolian, Nepali, Norwegian Nynorsk, Occitan, Odia, Ossetian, Pashto, Persian, Polish, Portuguese, Punjabi, Quechua Chanka, Romanian, Romansh Sursilvan, Romansh Vallader, Russian, Sakha, Santali (Ol Chiki), Saraiki, Sardinian, Serbian, Slovak, Slovenian, Sorbian, Upper, Spanish, Swahili, Swedish, Taiwanese (Minnan), Tamazight, Tamil, Tatar, Telugu, Thai, Tigre, Tigrinya, Toki Pona, Turkish, Turkmen, Twi, Ukrainian, Urdu, Uyghur, Uzbek, Vietnamese, Votic, Welsh, Western Sierra Puebla Nahuatl, Yiddish, Yoruba
```
## How to use
The `datasets` library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the `load_dataset` function.
For example, to download the Hindi config, simply specify the corresponding language config name (i.e., "hi" for Hindi):
```python
from datasets import load_dataset
cv_16 = load_dataset("mozilla-foundation/common_voice_16_1", "hi", split="train")
```
Using the datasets library, you can also stream the dataset on-the-fly by adding a `streaming=True` argument to the `load_dataset` function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.
```python
from datasets import load_dataset
cv_16 = load_dataset("mozilla-foundation/common_voice_16_1", "hi", split="train", streaming=True)
print(next(iter(cv_16)))
```
*Bonus*: create a [PyTorch dataloader](https://huggingface.co/docs/datasets/use_with_pytorch) directly with your own datasets (local/streamed).
### Local
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
from torch.utils.data.sampler import BatchSampler, RandomSampler
cv_16 = load_dataset("mozilla-foundation/common_voice_16_1", "hi", split="train")
batch_sampler = BatchSampler(RandomSampler(cv_16), batch_size=32, drop_last=False)
dataloader = DataLoader(cv_16, batch_sampler=batch_sampler)
```
### Streaming
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
cv_16 = load_dataset("mozilla-foundation/common_voice_16_1", "hi", split="train", streaming=True)
dataloader = DataLoader(cv_16, batch_size=32)
```
To find out more about loading and preparing audio datasets, head over to [hf.co/blog/audio-datasets](https://huggingface.co/blog/audio-datasets).
### Example scripts
Train your own CTC or Seq2Seq Automatic Speech Recognition models on Common Voice 16 with `transformers` - [here](https://github.com/huggingface/transformers/tree/main/examples/pytorch/speech-recognition).
## Dataset Structure
### Data Instances
A typical data point comprises the `path` to the audio file and its `sentence`.
Additional fields include `accent`, `age`, `client_id`, `up_votes`, `down_votes`, `gender`, `locale` and `segment`.
```python
{
'client_id': 'd59478fbc1ee646a28a3c652a119379939123784d99131b865a89f8b21c81f69276c48bd574b81267d9d1a77b83b43e6d475a6cfc79c232ddbca946ae9c7afc5',
'path': 'et/clips/common_voice_et_18318995.mp3',
'audio': {
'path': 'et/clips/common_voice_et_18318995.mp3',
'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346, 0.00091553, 0.00085449], dtype=float32),
'sampling_rate': 48000
},
'sentence': 'Tasub kokku saada inimestega, keda tunned juba ammust ajast saati.',
'up_votes': 2,
'down_votes': 0,
'age': 'twenties',
'gender': 'male',
'accent': '',
'locale': 'et',
'segment': ''
}
```
### Data Fields
`client_id` (`string`): An id for which client (voice) made the recording
`path` (`string`): The path to the audio file
`audio` (`dict`): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`. A short resampling sketch is given after this field list.
`sentence` (`string`): The sentence the user was prompted to speak
`up_votes` (`int64`): How many upvotes the audio file has received from reviewers
`down_votes` (`int64`): How many downvotes the audio file has received from reviewers
`age` (`string`): The age of the speaker (e.g. `teens`, `twenties`, `fifties`)
`gender` (`string`): The gender of the speaker
`accent` (`string`): Accent of the speaker
`locale` (`string`): The locale of the speaker
`segment` (`string`): Usually an empty field
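Since the clips are stored at 48 kHz and the `audio` field is decoded on access, one common preparation step is to re-decode them at the sampling rate your model expects. A minimal sketch, assuming a 16 kHz model:

```python
from datasets import load_dataset, Audio

cv_16 = load_dataset("mozilla-foundation/common_voice_16_1", "hi", split="train")
# Re-decode every clip at 16 kHz on access instead of the native 48 kHz.
cv_16 = cv_16.cast_column("audio", Audio(sampling_rate=16_000))
sample = cv_16[0]["audio"]        # decoding and resampling happen here
print(sample["sampling_rate"])    # 16000
```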
### Data Splits
The speech material has been subdivided into portions for dev, train, test, validated, invalidated, reported and other.
The validated data is data that has been validated by reviewers and received upvotes indicating that the data is of high quality.
The invalidated data is data that has been invalidated by reviewers
and received downvotes indicating that the data is of low quality.
The reported data is data that has been reported for different reasons.
The other data is data that has not yet been reviewed.
The dev, test and train portions all contain data that has been reviewed and deemed of high quality.
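A sketch of loading several portions at once is shown below. The split names ("train", "validation", "test", "other", "invalidated") follow the usual Common Voice naming on the Hub and "validation" corresponds to the dev portion described above; check the dataset viewer if they differ for your language config.

```python
from datasets import load_dataset

# Request several splits in one call; the result is a list in the same order.
splits = load_dataset(
    "mozilla-foundation/common_voice_16_1",
    "hi",
    split=["train", "validation", "test"],
)
train, dev, test = splits
```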
## Data Preprocessing Recommended by Hugging Face
The following are data preprocessing steps advised by the Hugging Face team. They are accompanied by an example code snippet that shows how to put them to practice.
Many examples in this dataset have trailing quotation marks, e.g. _“the cat sat on the mat.”_. These trailing quotation marks do not change the actual meaning of the sentence, and it is near impossible to infer whether a sentence is a quotation or not a quotation from audio data alone. In these cases, it is advised to strip the quotation marks, leaving: _the cat sat on the mat_.
In addition, the majority of training sentences end in punctuation ( . or ? or ! ), whereas just a small proportion do not. In the dev set, **almost all** sentences end in punctuation. Thus, it is recommended to append a full-stop ( . ) to the end of the small number of training examples that do not end in punctuation.
```python
from datasets import load_dataset
ds = load_dataset("mozilla-foundation/common_voice_16_1", "en", use_auth_token=True)
def prepare_dataset(batch):
"""Function to preprocess the dataset with the .map method"""
transcription = batch["sentence"]
if transcription.startswith('"') and transcription.endswith('"'):
# we can remove trailing quotation marks as they do not affect the transcription
transcription = transcription[1:-1]
if transcription[-1] not in [".", "?", "!"]:
# append a full-stop to sentences that do not end in punctuation
transcription = transcription + "."
batch["sentence"] = transcription
return batch
ds = ds.map(prepare_dataset, desc="preprocess dataset")
```
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
## Considerations for Using the Data
### Social Impact of Dataset
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Public Domain, [CC-0](https://creativecommons.org/share-your-work/public-domain/cc0/)
### Citation Information
```
@inproceedings{commonvoice:2020,
author = {Ardila, R. and Branson, M. and Davis, K. and Henretty, M. and Kohler, M. and Meyer, J. and Morais, R. and Saunders, L. and Tyers, F. M. and Weber, G.},
title = {Common Voice: A Massively-Multilingual Speech Corpus},
booktitle = {Proceedings of the 12th Conference on Language Resources and Evaluation (LREC 2020)},
pages = {4211--4215},
year = 2020
}
```
| mozilla-foundation/common_voice_16_1 | [
"annotations_creators:crowdsourced",
"language_creators:crowdsourced",
"multilinguality:multilingual",
"language:ab",
"language:af",
"language:am",
"language:ar",
"language:as",
"language:ast",
"language:az",
"language:ba",
"language:bas",
"language:be",
"language:bg",
"language:bn",
"language:br",
"language:ca",
"language:ckb",
"language:cnh",
"language:cs",
"language:cv",
"language:cy",
"language:da",
"language:de",
"language:dv",
"language:dyu",
"language:el",
"language:en",
"language:eo",
"language:es",
"language:et",
"language:eu",
"language:fa",
"language:fi",
"language:fr",
"language:fy",
"language:ga",
"language:gl",
"language:gn",
"language:ha",
"language:he",
"language:hi",
"language:hsb",
"language:hu",
"language:hy",
"language:ia",
"language:id",
"language:ig",
"language:is",
"language:it",
"language:ja",
"language:ka",
"language:kab",
"language:kk",
"language:kmr",
"language:ko",
"language:ky",
"language:lg",
"language:lij",
"language:lo",
"language:lt",
"language:ltg",
"language:lv",
"language:mdf",
"language:mhr",
"language:mk",
"language:ml",
"language:mn",
"language:mr",
"language:mrj",
"language:mt",
"language:myv",
"language:nan",
"language:ne",
"language:nhi",
"language:nl",
"language:nn",
"language:oc",
"language:or",
"language:os",
"language:pa",
"language:pl",
"language:ps",
"language:pt",
"language:quy",
"language:rm",
"language:ro",
"language:ru",
"language:rw",
"language:sah",
"language:sat",
"language:sc",
"language:sk",
"language:skr",
"language:sl",
"language:sq",
"language:sr",
"language:sv",
"language:sw",
"language:ta",
"language:te",
"language:th",
"language:ti",
"language:tig",
"language:tk",
"language:tok",
"language:tr",
"language:tt",
"language:tw",
"language:ug",
"language:uk",
"language:ur",
"language:uz",
"language:vi",
"language:vot",
"language:yi",
"language:yo",
"language:yue",
"language:zgh",
"language:zh",
"license:cc0-1.0",
"arxiv:1912.06670",
"region:us"
] | 2024-01-15T12:20:41+00:00 | {"annotations_creators": ["crowdsourced"], "language_creators": ["crowdsourced"], "language": ["ab", "af", "am", "ar", "as", "ast", "az", "ba", "bas", "be", "bg", "bn", "br", "ca", "ckb", "cnh", "cs", "cv", "cy", "da", "de", "dv", "dyu", "el", "en", "eo", "es", "et", "eu", "fa", "fi", "fr", "fy", "ga", "gl", "gn", "ha", "he", "hi", "hsb", "hu", "hy", "ia", "id", "ig", "is", "it", "ja", "ka", "kab", "kk", "kmr", "ko", "ky", "lg", "lij", "lo", "lt", "ltg", "lv", "mdf", "mhr", "mk", "ml", "mn", "mr", "mrj", "mt", "myv", "nan", "ne", "nhi", "nl", "nn", "oc", "or", "os", "pa", "pl", "ps", "pt", "quy", "rm", "ro", "ru", "rw", "sah", "sat", "sc", "sk", "skr", "sl", "sq", "sr", "sv", "sw", "ta", "te", "th", "ti", "tig", "tk", "tok", "tr", "tt", "tw", "ug", "uk", "ur", "uz", "vi", "vot", "yi", "yo", "yue", "zgh", "zh"], "license": ["cc0-1.0"], "multilinguality": ["multilingual"], "paperswithcode_id": "common-voice", "pretty_name": "Common Voice Corpus 16.1", "language_bcp47": ["zh-CN", "zh-HK", "zh-TW", "sv-SE", "rm-sursilv", "rm-vallader", "pa-IN", "nn-NO", "ne-NP", "nan-tw", "hy-AM", "ga-IE", "fy-NL"], "extra_gated_prompt": "By clicking on \u201cAccess repository\u201d below, you also agree to not attempt to determine the identity of speakers in the Common Voice dataset."} | 2024-01-16T13:57:04+00:00 | [
"1912.06670"
] | [
"ab",
"af",
"am",
"ar",
"as",
"ast",
"az",
"ba",
"bas",
"be",
"bg",
"bn",
"br",
"ca",
"ckb",
"cnh",
"cs",
"cv",
"cy",
"da",
"de",
"dv",
"dyu",
"el",
"en",
"eo",
"es",
"et",
"eu",
"fa",
"fi",
"fr",
"fy",
"ga",
"gl",
"gn",
"ha",
"he",
"hi",
"hsb",
"hu",
"hy",
"ia",
"id",
"ig",
"is",
"it",
"ja",
"ka",
"kab",
"kk",
"kmr",
"ko",
"ky",
"lg",
"lij",
"lo",
"lt",
"ltg",
"lv",
"mdf",
"mhr",
"mk",
"ml",
"mn",
"mr",
"mrj",
"mt",
"myv",
"nan",
"ne",
"nhi",
"nl",
"nn",
"oc",
"or",
"os",
"pa",
"pl",
"ps",
"pt",
"quy",
"rm",
"ro",
"ru",
"rw",
"sah",
"sat",
"sc",
"sk",
"skr",
"sl",
"sq",
"sr",
"sv",
"sw",
"ta",
"te",
"th",
"ti",
"tig",
"tk",
"tok",
"tr",
"tt",
"tw",
"ug",
"uk",
"ur",
"uz",
"vi",
"vot",
"yi",
"yo",
"yue",
"zgh",
"zh"
] | TAGS
#annotations_creators-crowdsourced #language_creators-crowdsourced #multilinguality-multilingual #language-Abkhazian #language-Afrikaans #language-Amharic #language-Arabic #language-Assamese #language-Asturian #language-Azerbaijani #language-Bashkir #language-Basa (Cameroon) #language-Belarusian #language-Bulgarian #language-Bengali #language-Breton #language-Catalan #language-Central Kurdish #language-Hakha Chin #language-Czech #language-Chuvash #language-Welsh #language-Danish #language-German #language-Dhivehi #language-Dyula #language-Modern Greek (1453-) #language-English #language-Esperanto #language-Spanish #language-Estonian #language-Basque #language-Persian #language-Finnish #language-French #language-Western Frisian #language-Irish #language-Galician #language-Guarani #language-Hausa #language-Hebrew #language-Hindi #language-Upper Sorbian #language-Hungarian #language-Armenian #language-Interlingua (International Auxiliary Language Association) #language-Indonesian #language-Igbo #language-Icelandic #language-Italian #language-Japanese #language-Georgian #language-Kabyle #language-Kazakh #language-Northern Kurdish #language-Korean #language-Kirghiz #language-Ganda #language-Ligurian #language-Lao #language-Lithuanian #language-Latgalian #language-Latvian #language-Moksha #language-Eastern Mari #language-Macedonian #language-Malayalam #language-Mongolian #language-Marathi #language-Western Mari #language-Maltese #language-Erzya #language-Min Nan Chinese #language-Nepali (macrolanguage) #language-Zacatlán-Ahuacatlán-Tepetzintla Nahuatl #language-Dutch #language-Norwegian Nynorsk #language-Occitan (post 1500) #language-Oriya (macrolanguage) #language-Ossetian #language-Panjabi #language-Polish #language-Pushto #language-Portuguese #language-Ayacucho Quechua #language-Romansh #language-Romanian #language-Russian #language-Kinyarwanda #language-Yakut #language-Santali #language-Sardinian #language-Slovak #language-Saraiki #language-Slovenian #language-Albanian #language-Serbian #language-Swedish #language-Swahili (macrolanguage) #language-Tamil #language-Telugu #language-Thai #language-Tigrinya #language-Tigre #language-Turkmen #language-Toki Pona #language-Turkish #language-Tatar #language-Twi #language-Uighur #language-Ukrainian #language-Urdu #language-Uzbek #language-Vietnamese #language-Votic #language-Yiddish #language-Yoruba #language-Yue Chinese #language-Standard Moroccan Tamazight #language-Chinese #license-cc0-1.0 #arxiv-1912.06670 #region-us
|
# Dataset Card for Common Voice Corpus 16.1
## Table of Contents
- Dataset Description
- Dataset Summary
- Supported Tasks and Leaderboards
- Languages
- Dataset Structure
- Data Instances
- Data Fields
- Data Splits
- Dataset Creation
- Curation Rationale
- Source Data
- Annotations
- Personal and Sensitive Information
- Considerations for Using the Data
- Social Impact of Dataset
- Discussion of Biases
- Other Known Limitations
- Additional Information
- Dataset Curators
- Licensing Information
- Citation Information
- Contributions
## Dataset Description
- Homepage: URL
- Repository: URL
- Paper: URL
- Leaderboard: URL
- Point of Contact: Vaibhav Srivastav
### Dataset Summary
The Common Voice dataset consists of a unique MP3 and corresponding text file.
Many of the 30328 recorded hours in the dataset also include demographic metadata like age, sex, and accent
that can help improve the accuracy of speech recognition engines.
The dataset currently consists of 19673 validated hours in 120 languages, but more voices and languages are always added.
Take a look at the Languages page to request a language or start contributing.
### Languages
## How to use
The 'datasets' library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the 'load_dataset' function.
For example, to download the Hindi config, simply specify the corresponding language config name (i.e., "hi" for Hindi):
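A minimal sketch of that call (the 'train' split is just an example; note the dataset is gated, so you may need to be logged in to the Hub and have accepted the terms):

```python
from datasets import load_dataset

# downloads and prepares the Hindi config of Common Voice 16.1 locally
cv_16 = load_dataset("mozilla-foundation/common_voice_16_1", "hi", split="train")
```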
Using the datasets library, you can also stream the dataset on-the-fly by adding a 'streaming=True' argument to the 'load_dataset' function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.
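For example, the same Hindi config can be streamed like this (a sketch; the first sample is fetched lazily):

```python
from datasets import load_dataset

cv_16 = load_dataset("mozilla-foundation/common_voice_16_1", "hi", split="train", streaming=True)

print(next(iter(cv_16)))  # only the first example is downloaded and decoded
```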
*Bonus*: create a PyTorch dataloader directly with your own datasets (local/streamed).
### Local
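A sketch of a local (map-style) PyTorch dataloader over the downloaded split; the pass-through 'collate_fn' is an assumption here, since variable-length audio arrays cannot be stacked by the default collator:

```python
from datasets import load_dataset
from torch.utils.data import DataLoader
from torch.utils.data.sampler import BatchSampler, RandomSampler

cv_16 = load_dataset("mozilla-foundation/common_voice_16_1", "hi", split="train")

batch_sampler = BatchSampler(RandomSampler(cv_16), batch_size=32, drop_last=False)
# keep each batch as a plain list of examples; pad/stack the audio arrays in your own collator
dataloader = DataLoader(cv_16, batch_sampler=batch_sampler, collate_fn=lambda batch: batch)
```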
### Streaming
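A sketch of the streamed variant; a streamed 'IterableDataset' plugs straight into a PyTorch 'DataLoader', and 'batch_size=None' keeps examples unbatched:

```python
from datasets import load_dataset
from torch.utils.data import DataLoader

cv_16 = load_dataset("mozilla-foundation/common_voice_16_1", "hi", split="train", streaming=True)

# yields one decoded example at a time, without collation
dataloader = DataLoader(cv_16, batch_size=None)

for example in dataloader:
    print(example["sentence"])
    break
```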
To find out more about loading and preparing audio datasets, head over to URL
### Example scripts
Train your own CTC or Seq2Seq Automatic Speech Recognition models on Common Voice 16 with 'transformers' - here.
## Dataset Structure
### Data Instances
A typical data point comprises the 'path' to the audio file and its 'sentence'.
Additional fields include 'accent', 'age', 'client_id', 'up_votes', 'down_votes', 'gender', 'locale' and 'segment'.
### Data Fields
'client_id' ('string'): An id for which client (voice) made the recording
'path' ('string'): The path to the audio file
'audio' ('dict'): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: 'dataset[0]["audio"]' the audio file is automatically decoded and resampled to 'dataset.features["audio"].sampling_rate'. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the '"audio"' column, *i.e.* 'dataset[0]["audio"]' should always be preferred over 'dataset["audio"][0]'. A short sketch of this access pattern is shown right after this field list.
'sentence' ('string'): The sentence the user was prompted to speak
'up_votes' ('int64'): How many upvotes the audio file has received from reviewers
'down_votes' ('int64'): How many downvotes the audio file has received from reviewers
'age' ('string'): The age of the speaker (e.g. 'teens', 'twenties', 'fifties')
'gender' ('string'): The gender of the speaker
'accent' ('string'): Accent of the speaker
'locale' ('string'): The locale of the speaker
'segment' ('string'): Usually an empty field
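A short sketch of the access pattern recommended for the 'audio' field above (index the row first, then the column):

```python
from datasets import load_dataset

cv_16 = load_dataset("mozilla-foundation/common_voice_16_1", "hi", split="train")

sample = cv_16[0]["audio"]  # row first, then column: only this one file is decoded and resampled
print(sample["sampling_rate"], sample["array"].shape)
```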
### Data Splits
The speech material has been subdivided into portions for dev, train, test, validated, invalidated, reported and other.
The validated data is data that has been validated by reviewers and received upvotes indicating that the data is of high quality.
The invalidated data is data that has been invalidated by reviewers
and received downvotes indicating that the data is of low quality.
The reported data is data that has been reported, for different reasons.
The other data is data that has not yet been reviewed.
The dev, test, train are all data that has been reviewed, deemed of high quality and split into dev, test and train.
## Data Preprocessing Recommended by Hugging Face
The following are data preprocessing steps advised by the Hugging Face team. They are accompanied by an example code snippet that shows how to put them into practice.
Many examples in this dataset have trailing quotation marks, e.g. _“the cat sat on the mat.”_. These trailing quotation marks do not change the actual meaning of the sentence, and it is near impossible to infer whether a sentence is a quotation or not from audio data alone. In these cases, it is advised to strip the quotation marks, leaving: _the cat sat on the mat_.
In addition, the majority of training sentences end in punctuation ( . or ? or ! ), whereas just a small proportion do not. In the dev set, almost all sentences end in punctuation. Thus, it is recommended to append a full-stop ( . ) to the end of the small number of training examples that do not end in punctuation.
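The snippet below is a sketch of those two steps applied with 'Dataset.map'; the function name 'prepare_dataset' is just an illustrative choice:

```python
from datasets import load_dataset

cv_16 = load_dataset("mozilla-foundation/common_voice_16_1", "hi", split="train")

def prepare_dataset(batch):
    transcription = batch["sentence"].strip()
    # 1. strip quotation marks wrapping the sentence
    transcription = transcription.strip('"“”„')
    # 2. make sure the sentence ends in punctuation
    if transcription and transcription[-1] not in [".", "?", "!"]:
        transcription = transcription + "."
    batch["sentence"] = transcription
    return batch

cv_16 = cv_16.map(prepare_dataset, desc="preprocess dataset")
```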
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
## Considerations for Using the Data
### Social Impact of Dataset
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
Public Domain, CC-0