| sha | text | id | tags | created_at | metadata | last_modified |
|---|---|---|---|---|---|---|
749b5f97bfa32e21caa639d871acdd22e84e37eb | AI-Golden/aigolden-model | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T13:30:57+00:00 | {"license": "apache-2.0"} | 2024-01-14T13:35:21+00:00 |
|
3e4f8c55c55d908fcdc3a3a118126cabce6e3b09 |
# Dataset of stremitelny/ストレミテルヌイ/神速 (Azur Lane)
This is the dataset of stremitelny/ストレミテルヌイ/神速 (Azur Lane), containing 15 images and their tags.
The core tags of this character are `red_eyes, white_hair, long_hair, hair_between_eyes, bangs, antenna_hair, hat, ahoge, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 15 | 22.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stremitelny_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 15 | 13.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stremitelny_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 38 | 28.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stremitelny_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 15 | 20.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stremitelny_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 38 | 41.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stremitelny_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/stremitelny_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
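The `IMG+TXT` packages (e.g. `800` from the table above) can also be used without waifuc. Below is a minimal sketch using only `huggingface_hub` and the standard library; it assumes the usual layout where each image ships with a same-stem `.txt` tag file, which is how these packages are typically arranged.

```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT archive (filename taken from the package table above)
zip_file = hf_hub_download(
    repo_id='CyberHarem/stremitelny_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract it to a separate directory
img_dir = 'dataset_800'
os.makedirs(img_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(img_dir)

# pair each image with its same-stem .txt tag file (assumed archive layout)
for name in sorted(os.listdir(img_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
        continue
    tag_path = os.path.join(img_dir, stem + '.txt')
    if os.path.exists(tag_path):
        with open(tag_path, 'r', encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```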
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, solo, long_sleeves, looking_at_viewer, blush, open_mouth, white_coat, :d, black_pantyhose, fur-trimmed_coat, simple_background, white_background, white_headwear |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | long_sleeves | looking_at_viewer | blush | open_mouth | white_coat | :d | black_pantyhose | fur-trimmed_coat | simple_background | white_background | white_headwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:--------------------|:--------|:-------------|:-------------|:-----|:------------------|:-------------------|:--------------------|:-------------------|:-----------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/stremitelny_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T13:31:03+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T13:35:33+00:00 |
60cc109811fd4dc6960ed95028b5c9f697894c47 |
# Dataset of stanly/スタンリー/斯坦利 (Azur Lane)
This is the dataset of stanly/スタンリー/斯坦利 (Azur Lane), containing 15 images and their tags.
The core tags of this character are `long_hair, purple_eyes, pink_hair, headband, hair_between_eyes, hairband, bangs, pink_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 15 | 14.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stanly_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 15 | 9.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stanly_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 34 | 17.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stanly_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 15 | 12.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stanly_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 34 | 21.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stanly_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/stanly_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, looking_at_viewer, jacket, solo, smile, blush, single_thighhigh, necktie, white_background, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | jacket | solo | smile | blush | single_thighhigh | necktie | white_background | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:---------|:-------|:--------|:--------|:-------------------|:----------|:-------------------|:--------------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/stanly_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T13:31:06+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T13:34:59+00:00 |
214063b0119eca7586d7308d3a28046e3b7a452a | spellingdragon/common_voice_9_zh-TW_simple-whisper-large-v3 | [
"region:us"
] | 2024-01-14T13:31:13+00:00 | {"dataset_info": {"features": [{"name": "input_features", "sequence": {"sequence": "float32"}}, {"name": "labels", "sequence": "int64"}, {"name": "input_length", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 24016183040.0, "num_examples": 15629}, {"name": "test", "num_bytes": 14978536672, "num_examples": 9747}], "download_size": 5997112102, "dataset_size": 38994719712.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-14T13:38:23+00:00 |
|
991405bf65275118be77348f1d62af69210e6011 | Sussyb/Querogoza | [
"region:us"
] | 2024-01-14T13:34:56+00:00 | {} | 2024-01-14T14:18:41+00:00 |
|
4efb8da236046b1043272931e15f8450e75d205c |
# Downloading this Options Dataset
This document will guide you through the steps to download the Merval options dataset from Hugging Face Datasets.
To start, you'll need to install Hugging Face's `datasets` library if you haven't done so already.
You can do this using the following pip command:
```python
!pip install datasets
```
Here's the Python code to load the Merval options dataset from Hugging Face Datasets and convert it into a pandas DataFrame:
```python
from datasets import load_dataset
import pandas as pd
id = "gauss314/opciones"
data = load_dataset(id)
df = pd.DataFrame(data['train'][:])
``` | gauss314/opciones | [
"task_categories:tabular-classification",
"task_categories:tabular-regression",
"license:apache-2.0",
"Merval",
"options",
"region:us"
] | 2024-01-14T13:38:42+00:00 | {"license": "apache-2.0", "task_categories": ["tabular-classification", "tabular-regression"], "pretty_name": "Merval historical options data, for deep learning and machine learning tests", "tags": ["Merval", "options"]} | 2024-01-14T13:46:43+00:00 |
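Once the `gauss314/opciones` DataFrame above is built, a quick sanity check with plain pandas confirms the load worked; this is only an illustrative sketch and makes no assumptions about specific column names.

```python
from datasets import load_dataset
import pandas as pd

data = load_dataset("gauss314/opciones")
df = pd.DataFrame(data['train'][:])

# basic sanity checks: row/column counts, column names, numeric summary statistics
print(df.shape)
print(list(df.columns))
print(df.describe().T)
```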
a57e4b4f7cb4cf05e08b45946620572875aa88b3 |
# Dataset of bearn/ベアルン/贝亚恩 (Azur Lane)
This is the dataset of bearn/ベアルン/贝亚恩 (Azur Lane), containing 13 images and their tags.
The core tags of this character are `bangs, breasts, small_breasts, multicolored_hair, short_hair, horns, black_hair, glasses, grey_hair, blunt_bangs, grey_eyes, purple_hair, streaked_hair, two-tone_hair, blue_eyes, hairband`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 14.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bearn_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 8.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bearn_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 23 | 14.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bearn_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 12.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bearn_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 23 | 20.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bearn_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/bearn_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, looking_at_viewer, solo, monocle, bare_shoulders, holding, simple_background, black_gloves, blush, closed_mouth, covered_navel, long_sleeves, thighhighs, dress, full_body, jacket, off_shoulder, swimsuit, thigh_boots, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | monocle | bare_shoulders | holding | simple_background | black_gloves | blush | closed_mouth | covered_navel | long_sleeves | thighhighs | dress | full_body | jacket | off_shoulder | swimsuit | thigh_boots | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:----------|:-----------------|:----------|:--------------------|:---------------|:--------|:---------------|:----------------|:---------------|:-------------|:--------|:------------|:---------|:---------------|:-----------|:--------------|:-------------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/bearn_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T13:46:10+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T13:49:14+00:00 |
769b1b78c9dd7814175057cb4e9b759a0664e1e3 |
# Dataset Card for Evaluation run of FelixChao/MathDolphin-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/MathDolphin-7B](https://huggingface.co/FelixChao/MathDolphin-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__MathDolphin-7B",
"harness_winogrande_5",
split="train")
```
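To see which of the 63 configurations exist, or to always point at the most recent run regardless of its timestamp, a small sketch using `get_dataset_config_names` from `datasets` and the "latest" split alias could look like this; the `harness_gsm8k_5` config name is taken from this repo's metadata, and the exact names should be treated as an assumption.

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_FelixChao__MathDolphin-7B"

# list every evaluation configuration stored in this details repo
configs = get_dataset_config_names(repo)
print(len(configs), "configurations, e.g.", configs[:5])

# the "latest" split of a configuration points at the most recent run
gsm8k = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(gsm8k)
```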
## Latest results
These are the [latest results from run 2024-01-14T13:48:07.624647](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__MathDolphin-7B/blob/main/results_2024-01-14T13-48-07.624647.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.65315875756261,
"acc_stderr": 0.03196302131707709,
"acc_norm": 0.6538202133223454,
"acc_norm_stderr": 0.03261743110455684,
"mc1": 0.3708690330477356,
"mc1_stderr": 0.01690969358024882,
"mc2": 0.5291514968771067,
"mc2_stderr": 0.015285199336849235
},
"harness|arc:challenge|25": {
"acc": 0.6279863481228669,
"acc_stderr": 0.014124597881844461,
"acc_norm": 0.658703071672355,
"acc_norm_stderr": 0.01385583128749773
},
"harness|hellaswag|10": {
"acc": 0.6622186815375424,
"acc_stderr": 0.004719870074967248,
"acc_norm": 0.8549093806014738,
"acc_norm_stderr": 0.0035147239847366034
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.75,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.025467149045469553,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.025467149045469553
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642514,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642514
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.019321805557223168,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.019321805557223168
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971118,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971118
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948482,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887034,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887034
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.02646056956124064,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.02646056956124064
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.03036037971029195,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.03036037971029195
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.03192193448934724,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.03192193448934724
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281372,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3854748603351955,
"acc_stderr": 0.01627792703963819,
"acc_norm": 0.3854748603351955,
"acc_norm_stderr": 0.01627792703963819
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958143,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958143
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179604,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179604
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.023468429832451152,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.023468429832451152
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.012752858346533133,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.012752858346533133
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462937,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462937
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3708690330477356,
"mc1_stderr": 0.01690969358024882,
"mc2": 0.5291514968771067,
"mc2_stderr": 0.015285199336849235
},
"harness|winogrande|5": {
"acc": 0.8121546961325967,
"acc_stderr": 0.010977481103435091
},
"harness|gsm8k|5": {
"acc": 0.6785443517816527,
"acc_stderr": 0.012864471384836703
}
}
```
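Per-benchmark aggregates (for example an MMLU average over the `harness|hendrycksTest-*` subjects) can be recomputed from the results file linked above. A sketch using `hf_hub_download` follows; the top-level `"results"` key and the averaging over subject `acc` values are assumptions about the file layout, not something guaranteed by this card.

```python
import json
from huggingface_hub import hf_hub_download

# fetch the results file referenced in the "Latest results" section above
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_FelixChao__MathDolphin-7B",
    repo_type="dataset",
    filename="results_2024-01-14T13-48-07.624647.json",
)
with open(path, "r", encoding="utf-8") as f:
    report = json.load(f)

# assumed layout: a top-level "results" key holding the per-task dict shown above
results = report.get("results", report)
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"MMLU average acc over {len(mmlu)} subjects: {sum(mmlu) / len(mmlu):.4f}")
```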
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_FelixChao__MathDolphin-7B | [
"region:us"
] | 2024-01-14T13:50:28+00:00 | {"pretty_name": "Evaluation run of FelixChao/MathDolphin-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [FelixChao/MathDolphin-7B](https://huggingface.co/FelixChao/MathDolphin-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__MathDolphin-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T13:48:07.624647](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__MathDolphin-7B/blob/main/results_2024-01-14T13-48-07.624647.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.65315875756261,\n \"acc_stderr\": 0.03196302131707709,\n \"acc_norm\": 0.6538202133223454,\n \"acc_norm_stderr\": 0.03261743110455684,\n \"mc1\": 0.3708690330477356,\n \"mc1_stderr\": 0.01690969358024882,\n \"mc2\": 0.5291514968771067,\n \"mc2_stderr\": 0.015285199336849235\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6279863481228669,\n \"acc_stderr\": 0.014124597881844461,\n \"acc_norm\": 0.658703071672355,\n \"acc_norm_stderr\": 0.01385583128749773\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6622186815375424,\n \"acc_stderr\": 0.004719870074967248,\n \"acc_norm\": 0.8549093806014738,\n \"acc_norm_stderr\": 0.0035147239847366034\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 
0.049999999999999996\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.025467149045469553,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.025467149045469553\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642514,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642514\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223168,\n \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223168\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971118,\n 
\"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971118\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948482,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948482\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887034,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887034\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.02646056956124064,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.02646056956124064\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n \"acc_stderr\": 0.03036037971029195,\n \"acc_norm\": 0.7130044843049327,\n \"acc_norm_stderr\": 0.03036037971029195\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.03192193448934724,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.03192193448934724\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281372,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281372\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371803\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3854748603351955,\n \"acc_stderr\": 0.01627792703963819,\n \"acc_norm\": 0.3854748603351955,\n \"acc_norm_stderr\": 0.01627792703963819\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958143,\n \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958143\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.025403832978179604,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.025403832978179604\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.023468429832451152,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.023468429832451152\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n \"acc_stderr\": 0.012752858346533133,\n \"acc_norm\": 0.47392438070404175,\n \"acc_norm_stderr\": 0.012752858346533133\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462937,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462937\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3708690330477356,\n \"mc1_stderr\": 0.01690969358024882,\n \"mc2\": 0.5291514968771067,\n \"mc2_stderr\": 0.015285199336849235\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8121546961325967,\n \"acc_stderr\": 0.010977481103435091\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6785443517816527,\n \"acc_stderr\": 0.012864471384836703\n }\n}\n```", "repo_url": "https://huggingface.co/FelixChao/MathDolphin-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|arc:challenge|25_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|gsm8k|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hellaswag|10_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T13-48-07.624647.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T13-48-07.624647.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T13-48-07.624647.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T13-48-07.624647.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T13-48-07.624647.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T13-48-07.624647.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["**/details_harness|winogrande|5_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T13-48-07.624647.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T13_48_07.624647", "path": ["results_2024-01-14T13-48-07.624647.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T13-48-07.624647.parquet"]}]}]} | 2024-01-14T13:50:49+00:00 |
6740f433dd642d55cb44d91d69d09bd4f0bb3e5a |
# Dataset of asuka/飛鳥/飞鸟 (Azur Lane)
This is the dataset of asuka/飛鳥/飞鸟 (Azur Lane), containing 368 images and their tags.
The core tags of this character are `breasts, ponytail, brown_eyes, ribbon, large_breasts, hair_ribbon, black_hair, brown_hair, white_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 368 | 446.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asuka_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 368 | 270.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asuka_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 875 | 566.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asuka_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 368 | 396.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asuka_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 875 | 781.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/asuka_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/asuka_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
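The pre-processed packages from the table above can be fetched the same way as the raw archive. Below is a minimal sketch for the 800px IMG+TXT package; the `dataset-800.zip` filename is taken from the download links listed above:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package instead of the raw archive
zip_file = hf_hub_download(
    repo_id='CyberHarem/asuka_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# each image is stored next to a .txt file holding its tags
pair_dir = 'asuka_800'
os.makedirs(pair_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(pair_dir)
```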
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, solo, cleavage, open_mouth, :d, blush, striped_bikini, navel, red_scarf, simple_background, white_background |
| 1 | 10 |  |  |  |  |  | 1girl, navel, solo, striped_bikini, cleavage, front-tie_top, looking_at_viewer, blush, side-tie_bikini_bottom, multicolored_stripes, open_mouth, red_scarf, white_background, smile, multicolored_clothes, simple_background |
| 2 | 9 |  |  |  |  |  | cleavage, looking_at_viewer, 1girl, cloud, day, open_mouth, outdoors, solo, blue_sky, beach, navel, side-tie_bikini_bottom, smile, ocean, striped_bikini, blush, long_hair |
| 3 | 30 |  |  |  |  |  | school_uniform, 1girl, solo, sweater_vest, black_thighhighs, dual_wielding, plaid_skirt, red_scarf, katana, necktie, smile, looking_at_viewer, blush |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | cleavage | open_mouth | :d | blush | striped_bikini | navel | red_scarf | simple_background | white_background | front-tie_top | side-tie_bikini_bottom | multicolored_stripes | smile | multicolored_clothes | cloud | day | outdoors | blue_sky | beach | ocean | long_hair | school_uniform | sweater_vest | black_thighhighs | dual_wielding | plaid_skirt | katana | necktie |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-----------|:-------------|:-----|:--------|:-----------------|:--------|:------------|:--------------------|:-------------------|:----------------|:-------------------------|:-----------------------|:--------|:-----------------------|:--------|:------|:-----------|:-----------|:--------|:--------|:------------|:-----------------|:---------------|:-------------------|:----------------|:--------------|:---------|:----------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | X | X | | X | X | X | | | | | X | | X | | X | X | X | X | X | X | X | | | | | | | |
| 3 | 30 |  |  |  |  |  | X | X | X | | | | X | | | X | | | | | | X | | | | | | | | | X | X | X | X | X | X | X |
| CyberHarem/asuka_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T13:51:17+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T15:01:19+00:00 |
ec1757bb78afdf083e9175c125081462387cb460 | ouasdg/pedia | [
"region:us"
] | 2024-01-14T13:52:38+00:00 | {} | 2024-02-16T03:48:42+00:00 |
|
93206bbb22295ec1bbb26b67ceb3b225cb2387b9 | longevity-genie/all_pubmed | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T13:58:20+00:00 | {"license": "apache-2.0"} | 2024-01-14T13:58:20+00:00 |
|
f8c198f1bb9a37e9c683b9a6f96d37fa5de91e79 | # Dataset Card
- **Homepage: https://kaistai.github.io/prometheus-vision/**
- **Repository: https://github.com/kaistAI/prometheus-vision**
- **Paper: https://arxiv.org/abs/2401.06591**
- **Point of Contact: [email protected]**
### Dataset summary
Perception Collection is the first multi-modal feedback dataset that can be used to train an evaluator VLM. Perception Collection includes 15K fine-grained criteria that specify the crucial aspect to evaluate for each instance.

### Languages
English
## Dataset Structure
* image: The path of the images used for training, consisting of images from the MMMU dataset and COCO 2017 train dataset.
* instruction: The input that is given to the evaluator VLM. It includes the instruction and response to evaluate, the reference answer, and the score rubric.
* output: The output that the evaluator VLM should generate. It includes the feedback and the score decision, divided by the phrase ```[RESULT]```.
* orig```_```instruction: The instruction to be evaluated. Note that this differs from the instruction field above, which includes all the components.
* orig```_```response: The response to be evaluated.
* orig```_```reference```_```answer: A reference answer to the orig```_```instruction.
* orig```_```criteria: The score criteria used to evaluate the orig```_```response.
* orig```_```score1```_```description: A description of when to give a score of 1 to the orig```_```response.
* orig```_```score2```_```description: A description of when to give a score of 2 to the orig```_```response.
* orig```_```score3```_```description: A description of when to give a score of 3 to the orig```_```response.
* orig```_```score4```_```description: A description of when to give a score of 4 to the orig```_```response.
* orig```_```score5```_```description: A description of when to give a score of 5 to the orig```_```response.
* orig```_```feedback: A feedback that critiques the orig```_```response.
* orig```_```score: An integer between 1 and 5 given to the orig```_```response.
In our paper, we trained the model with inputs in the following prompt format (already processed into the 'instruction' field):
```
###Task Description:
An instruction (might include an Input inside it), a response to evaluate, a reference answer that gets a score of 5, image and a score rubric representing an evaluation criterion is given.
1. Write a detailed feedback that assess the quality of the response strictly based on the given score rubric, not evaluating in general.
2. After writing a feedback, write a score that is an integer between 1 and 5. You should refer to the score rubric.
3. The output format should look as follows: \"Feedback: (write a feedback for criteria) [RESULT] (an integer number between 1 and 5)\"
4. Please do not generate any other opening, closing, and explanations.
###The instruction to evaluate:
{orig_instruction}
###Response to evaluate:
{orig_response}
###Reference Answer (Score 5):
{orig_reference_answer}
###Score Rubrics:
[{orig_criteria}]
Score 1: {orig_score1_description}
Score 2: {orig_score2_description}
Score 3: {orig_score3_description}
Score 4: {orig_score4_description}
Score 5: {orig_score5_description}
###Feedback:
```
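As a rough illustration, the prompt above can be re-assembled from the `orig_*` columns of a single row. This is only a sketch with assumed column names matching the structure section; in practice the `instruction` column already stores the fully formatted prompt:
```python
def build_instruction(row: dict) -> str:
    """Re-assemble the evaluator prompt from the orig_* fields of one example."""
    # the fixed "###Task Description" header shown above is omitted here for brevity
    return (
        "###The instruction to evaluate:\n"
        f"{row['orig_instruction']}\n\n"
        "###Response to evaluate:\n"
        f"{row['orig_response']}\n\n"
        "###Reference Answer (Score 5):\n"
        f"{row['orig_reference_answer']}\n\n"
        "###Score Rubrics:\n"
        f"[{row['orig_criteria']}]\n"
        f"Score 1: {row['orig_score1_description']}\n"
        f"Score 2: {row['orig_score2_description']}\n"
        f"Score 3: {row['orig_score3_description']}\n"
        f"Score 4: {row['orig_score4_description']}\n"
        f"Score 5: {row['orig_score5_description']}\n\n"
        "###Feedback:"
    )
```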
### Data Splits
| name | train |
|-------------------|------:|
|Perception-Collection|150,108|
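The train split can be loaded directly with the `datasets` library. The snippet below is a minimal sketch; column names follow the structure section above, and the `[RESULT]` delimiter comes from the description of the `output` field:
```python
from datasets import load_dataset

ds = load_dataset("kaist-ai/Perception-Collection", split="train")

example = ds[0]
print(example["image"])        # path of the image attached to this instance
print(example["instruction"])  # fully formatted evaluator prompt

# the target output holds the feedback and the score, separated by "[RESULT]"
feedback, score = example["output"].rsplit("[RESULT]", 1)
print(feedback.strip())
print(int(score.strip()))
```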
### Citation Information
If you find the following dataset helpful, please consider citing our paper!
```bibtex
@misc{lee2024prometheusvision,
title={Prometheus-Vision: Vision-Language Model as a Judge for Fine-Grained Evaluation},
author={Seongyun Lee and Seungone Kim and Sue Hyun Park and Geewook Kim and Minjoon Seo},
year={2024},
eprint={2401.06591},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | kaist-ai/Perception-Collection | [
"task_categories:visual-question-answering",
"task_categories:text2text-generation",
"task_categories:image-to-text",
"size_categories:100K<n<1M",
"language:en",
"license:cc-by-4.0",
"arxiv:2401.06591",
"region:us"
] | 2024-01-14T14:05:16+00:00 | {"language": ["en"], "license": "cc-by-4.0", "size_categories": ["100K<n<1M"], "task_categories": ["visual-question-answering", "text2text-generation", "image-to-text"]} | 2024-01-15T12:52:11+00:00 |
1c00677a300f564902947cd649714b7cfb8fa6f1 | nirdrang/anthro-ai | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T14:06:33+00:00 | {"license": "apache-2.0"} | 2024-02-05T14:53:37+00:00 |
|
288a3244f0e58d97a7b3cc9f33ff420dfebb6588 |
# Dataset of constellation/コンステレーション/星座 (Azur Lane)
This is the dataset of constellation/コンステレーション/星座 (Azur Lane), containing 15 images and their tags.
The core tags of this character are `long_hair, blue_hair, breasts, large_breasts, very_long_hair, animal_ears, hair_between_eyes, blue_eyes, bangs, fake_animal_ears, purple_eyes, rabbit_ears`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 15 | 23.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/constellation_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 15 | 11.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/constellation_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 37 | 24.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/constellation_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 15 | 20.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/constellation_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 37 | 35.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/constellation_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/constellation_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, playboy_bunny, solo, wrist_cuffs, bare_shoulders, thigh_strap, white_leotard, white_pantyhose, cleavage, detached_collar, official_alternate_costume, strapless_leotard, full_body, navel_cutout, simple_background, smile, blush, bowtie, closed_mouth, highleg_leotard, holding, light_blue_hair, sitting, standing |
| 1 | 5 |  |  |  |  |  | 1girl, bare_shoulders, halo, looking_at_viewer, solo, white_dress, full_body, no_shoes, sideboob, elbow_gloves, from_side, simple_background, white_background, ass, blush, cleavage, feet, knee_up, legs, light_blue_hair, parted_lips, sitting, stirrup_legwear, toes, white_gloves, white_thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | playboy_bunny | solo | wrist_cuffs | bare_shoulders | thigh_strap | white_leotard | white_pantyhose | cleavage | detached_collar | official_alternate_costume | strapless_leotard | full_body | navel_cutout | simple_background | smile | blush | bowtie | closed_mouth | highleg_leotard | holding | light_blue_hair | sitting | standing | halo | white_dress | no_shoes | sideboob | elbow_gloves | from_side | white_background | ass | feet | knee_up | legs | parted_lips | stirrup_legwear | toes | white_gloves | white_thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:----------------|:-------|:--------------|:-----------------|:--------------|:----------------|:------------------|:-----------|:------------------|:-----------------------------|:--------------------|:------------|:---------------|:--------------------|:--------|:--------|:---------|:---------------|:------------------|:----------|:------------------|:----------|:-----------|:-------|:--------------|:-----------|:-----------|:---------------|:------------|:-------------------|:------|:-------|:----------|:-------|:--------------|:------------------|:-------|:---------------|:-------------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | | X | | X | | | | X | | | | X | | X | | X | | | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/constellation_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T14:07:46+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T14:11:35+00:00 |
a54ac9aab90f371085615a38026d4681fa128b0b |
# Dataset of eagle/イーグル/鹰 (Azur Lane)
This is the dataset of eagle/イーグル/鹰 (Azur Lane), containing 10 images and their tags.
The core tags of this character are `breasts, bangs, large_breasts, long_hair, hairband, grey_hair, yellow_eyes, braid`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 14.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eagle_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 7.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eagle_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 24 | 16.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eagle_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 12.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eagle_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 24 | 24.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eagle_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/eagle_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, solo, black_bra, black_pantyhose, closed_mouth, pencil_skirt, white_shirt, black_skirt, holding, necklace, bra_peek, collarbone, miniskirt, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | looking_at_viewer | solo | black_bra | black_pantyhose | closed_mouth | pencil_skirt | white_shirt | black_skirt | holding | necklace | bra_peek | collarbone | miniskirt | sitting |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------------------|:-------|:------------|:------------------|:---------------|:---------------|:--------------|:--------------|:----------|:-----------|:-----------|:-------------|:------------|:----------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/eagle_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T14:07:53+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T14:11:12+00:00 |
cf9ee7fe2837dc9340f7647f3c07c46a5656a8d5 |
# Dataset of curlew/カーリュー/杓鹬 (Azur Lane)
This is the dataset of curlew/カーリュー/杓鹬 (Azur Lane), containing 10 images and their tags.
The core tags of this character are `bangs, blue_eyes, braid, breasts, long_hair, purple_hair, large_breasts, bow, hair_bow, maid_headdress, single_braid, sidelocks`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 13.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/curlew_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 9.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/curlew_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 17 | 14.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/curlew_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 12.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/curlew_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 17 | 17.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/curlew_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/curlew_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | looking_at_viewer, juliet_sleeves, maid_apron, frills, 1girl, cannon, full_body, holding, machinery, parted_lips, rigging, shoes, sitting, turret, white_apron, 2girls, black_dress, black_footwear, chair, cleavage, flower, indoors, petals, solo_focus |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | juliet_sleeves | maid_apron | frills | 1girl | cannon | full_body | holding | machinery | parted_lips | rigging | shoes | sitting | turret | white_apron | 2girls | black_dress | black_footwear | chair | cleavage | flower | indoors | petals | solo_focus |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:-----------------|:-------------|:---------|:--------|:---------|:------------|:----------|:------------|:--------------|:----------|:--------|:----------|:---------|:--------------|:---------|:--------------|:-----------------|:--------|:-----------|:---------|:----------|:---------|:-------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/curlew_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T14:07:55+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T14:12:03+00:00 |
ac7fb05c8b23d90217daefac02dec435d9994358 | # Dataset Card
- **Homepage: https://kaistai.github.io/prometheus-vision/**
- **Repository: https://github.com/kaistAI/prometheus-vision**
- **Paper: https://arxiv.org/abs/2401.06591**
- **Point of Contact: [email protected]**
### Dataset summary
Perception-Bench is a benchmark for evaluating the long-form responses of a VLM (Vision Language Model) across various image domains. It is a held-out test set of the [Perception-Collection](https://huggingface.co/datasets/kaist-ai/Perception-Collection).

### Languages
English
## Dataset Structure
* image: The path of the image for each instance, consisting of images from the MMMU dataset and the COCO 2017 train dataset.
* instruction: The input that is given to the evaluator VLM. It includes the instruction & response to evaluate, the reference answer, the score rubric.
* orig```_```instruction: The instruction to be evaluated. Note that this differs from the instruction field above, which includes all the components.
* orig```_```reference```_```answer: A reference answer to the orig```_```instruction.
* orig```_```criteria: The score criteria used to evaluate the orig```_```response.
* orig```_```score1```_```description: A description of when to give a score of 1 to the orig```_```response.
* orig```_```score2```_```description: A description of when to give a score of 2 to the orig```_```response.
* orig```_```score3```_```description: A description of when to give a score of 3 to the orig```_```response.
* orig```_```score4```_```description: A description of when to give a score of 4 to the orig```_```response.
* orig```_```score5```_```description: A description of when to give a score of 5 to the orig```_```response.
### Data Splits
| name | test |
|-------------------|------:|
|Perception-Bench|500|
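Since Perception-Bench ships a single test split, it can be loaded the same way; this is a sketch with the repository id and column names taken from the sections above:
```python
from datasets import load_dataset

bench = load_dataset("kaist-ai/Perception-Bench", split="test")

print(len(bench))             # 500 held-out instances
sample = bench[0]
print(sample["image"])        # image the response should be grounded in
print(sample["instruction"])  # evaluator prompt for this instance
```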
### Citation Information
If you find the following benchmark helpful, please consider citing our paper!
```bibtex
@misc{lee2024prometheusvision,
title={Prometheus-Vision: Vision-Language Model as a Judge for Fine-Grained Evaluation},
author={Seongyun Lee and Seungone Kim and Sue Hyun Park and Geewook Kim and Minjoon Seo},
year={2024},
eprint={2401.06591},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` | kaist-ai/Perception-Bench | [
"task_categories:visual-question-answering",
"task_categories:text2text-generation",
"task_categories:image-to-text",
"size_categories:n<1K",
"language:en",
"license:cc-by-4.0",
"arxiv:2401.06591",
"region:us"
] | 2024-01-14T14:09:06+00:00 | {"language": ["en"], "license": "cc-by-4.0", "size_categories": ["n<1K"], "task_categories": ["visual-question-answering", "text2text-generation", "image-to-text"]} | 2024-01-15T14:25:01+00:00 |
1aee537a81dc6e338938d914e066d4beb580b897 | Alignment-Lab-AI/asd | [
"region:us"
] | 2024-01-14T14:11:10+00:00 | {} | 2024-01-14T17:16:37+00:00 |
|
b91ad2a422ff9a892c355640a8987e774813d36f | Idor980/pantheon-data-sample | [
"region:us"
] | 2024-01-14T14:20:27+00:00 | {} | 2024-01-19T11:04:56+00:00 |
|
6b50b29008545fb4d4c5729e00da18a196b6d2fd |
# Dataset Card for Evaluation run of PSanni/MPOMixtral-8x7B-Instruct-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [PSanni/MPOMixtral-8x7B-Instruct-v0.1](https://huggingface.co/PSanni/MPOMixtral-8x7B-Instruct-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PSanni__MPOMixtral-8x7B-Instruct-v0.1",
"harness_winogrande_5",
split="train")
```
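The aggregated numbers live in the separate "results" configuration mentioned above and can be pulled the same way (a sketch reusing the config name from the description):
```python
from datasets import load_dataset
results = load_dataset("open-llm-leaderboard/details_PSanni__MPOMixtral-8x7B-Instruct-v0.1",
	"results",
	split="train")
```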
## Latest results
These are the [latest results from run 2024-01-14T14:23:30.207507](https://huggingface.co/datasets/open-llm-leaderboard/details_PSanni__MPOMixtral-8x7B-Instruct-v0.1/blob/main/results_2024-01-14T14-23-30.207507.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7021378898937154,
"acc_stderr": 0.03050401556178155,
"acc_norm": 0.7057392541812927,
"acc_norm_stderr": 0.031097861836160572,
"mc1": 0.5152998776009792,
"mc1_stderr": 0.017495304473187902,
"mc2": 0.665216588266765,
"mc2_stderr": 0.014619883028401507
},
"harness|arc:challenge|25": {
"acc": 0.6800341296928327,
"acc_stderr": 0.013631345807016193,
"acc_norm": 0.7098976109215017,
"acc_norm_stderr": 0.013261573677520764
},
"harness|hellaswag|10": {
"acc": 0.690300736904999,
"acc_stderr": 0.004614246282055377,
"acc_norm": 0.8795060744871539,
"acc_norm_stderr": 0.0032487292211528865
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6888888888888889,
"acc_stderr": 0.0399926287661772,
"acc_norm": 0.6888888888888889,
"acc_norm_stderr": 0.0399926287661772
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7960526315789473,
"acc_stderr": 0.0327900040631005,
"acc_norm": 0.7960526315789473,
"acc_norm_stderr": 0.0327900040631005
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7622641509433963,
"acc_stderr": 0.02619980880756193,
"acc_norm": 0.7622641509433963,
"acc_norm_stderr": 0.02619980880756193
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059006,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059006
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.033450369167889904,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.033450369167889904
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6851063829787234,
"acc_stderr": 0.03036358219723817,
"acc_norm": 0.6851063829787234,
"acc_norm_stderr": 0.03036358219723817
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.04028731532947558,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.04028731532947558
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47354497354497355,
"acc_stderr": 0.025715239811346758,
"acc_norm": 0.47354497354497355,
"acc_norm_stderr": 0.025715239811346758
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8387096774193549,
"acc_stderr": 0.020923327006423294,
"acc_norm": 0.8387096774193549,
"acc_norm_stderr": 0.020923327006423294
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5960591133004927,
"acc_stderr": 0.03452453903822033,
"acc_norm": 0.5960591133004927,
"acc_norm_stderr": 0.03452453903822033
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8585858585858586,
"acc_stderr": 0.024825909793343336,
"acc_norm": 0.8585858585858586,
"acc_norm_stderr": 0.024825909793343336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9585492227979274,
"acc_stderr": 0.01438543285747646,
"acc_norm": 0.9585492227979274,
"acc_norm_stderr": 0.01438543285747646
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7205128205128205,
"acc_stderr": 0.022752388839776823,
"acc_norm": 0.7205128205128205,
"acc_norm_stderr": 0.022752388839776823
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.029869605095316908,
"acc_norm": 0.4,
"acc_norm_stderr": 0.029869605095316908
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.027205371538279472,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.027205371538279472
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4768211920529801,
"acc_stderr": 0.04078093859163083,
"acc_norm": 0.4768211920529801,
"acc_norm_stderr": 0.04078093859163083
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8715596330275229,
"acc_stderr": 0.014344977542914318,
"acc_norm": 0.8715596330275229,
"acc_norm_stderr": 0.014344977542914318
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02485747808025045,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02485747808025045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.02133174182974679,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.02133174182974679
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699813,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699813
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.0349814938546247,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.0349814938546247
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.02919980245562281,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.02919980245562281
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.019119892798924974,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.019119892798924974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8722860791826309,
"acc_stderr": 0.011935626313999878,
"acc_norm": 0.8722860791826309,
"acc_norm_stderr": 0.011935626313999878
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.791907514450867,
"acc_stderr": 0.021855255263421795,
"acc_norm": 0.791907514450867,
"acc_norm_stderr": 0.021855255263421795
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4558659217877095,
"acc_stderr": 0.01665722942458631,
"acc_norm": 0.4558659217877095,
"acc_norm_stderr": 0.01665722942458631
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8006535947712419,
"acc_stderr": 0.02287581699346408,
"acc_norm": 0.8006535947712419,
"acc_norm_stderr": 0.02287581699346408
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7845659163987139,
"acc_stderr": 0.023350225475471442,
"acc_norm": 0.7845659163987139,
"acc_norm_stderr": 0.023350225475471442
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.021185893615225153,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.021185893615225153
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5443285528031291,
"acc_stderr": 0.012719949543032226,
"acc_norm": 0.5443285528031291,
"acc_norm_stderr": 0.012719949543032226
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7867647058823529,
"acc_stderr": 0.024880971512294254,
"acc_norm": 0.7867647058823529,
"acc_norm_stderr": 0.024880971512294254
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.01716058723504635,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.01716058723504635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7836734693877551,
"acc_stderr": 0.026358916334904028,
"acc_norm": 0.7836734693877551,
"acc_norm_stderr": 0.026358916334904028
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166323,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166323
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5152998776009792,
"mc1_stderr": 0.017495304473187902,
"mc2": 0.665216588266765,
"mc2_stderr": 0.014619883028401507
},
"harness|winogrande|5": {
"acc": 0.8255722178374112,
"acc_stderr": 0.010665187902498428
},
"harness|gsm8k|5": {
"acc": 0.5852918877937832,
"acc_stderr": 0.013570623842304504
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_PSanni__MPOMixtral-8x7B-Instruct-v0.1 | [
"region:us"
] | 2024-01-14T14:25:44+00:00 | {"pretty_name": "Evaluation run of PSanni/MPOMixtral-8x7B-Instruct-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [PSanni/MPOMixtral-8x7B-Instruct-v0.1](https://huggingface.co/PSanni/MPOMixtral-8x7B-Instruct-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PSanni__MPOMixtral-8x7B-Instruct-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T14:23:30.207507](https://huggingface.co/datasets/open-llm-leaderboard/details_PSanni__MPOMixtral-8x7B-Instruct-v0.1/blob/main/results_2024-01-14T14-23-30.207507.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7021378898937154,\n \"acc_stderr\": 0.03050401556178155,\n \"acc_norm\": 0.7057392541812927,\n \"acc_norm_stderr\": 0.031097861836160572,\n \"mc1\": 0.5152998776009792,\n \"mc1_stderr\": 0.017495304473187902,\n \"mc2\": 0.665216588266765,\n \"mc2_stderr\": 0.014619883028401507\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6800341296928327,\n \"acc_stderr\": 0.013631345807016193,\n \"acc_norm\": 0.7098976109215017,\n \"acc_norm_stderr\": 0.013261573677520764\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.690300736904999,\n \"acc_stderr\": 0.004614246282055377,\n \"acc_norm\": 0.8795060744871539,\n \"acc_norm_stderr\": 0.0032487292211528865\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6888888888888889,\n \"acc_stderr\": 0.0399926287661772,\n \"acc_norm\": 0.6888888888888889,\n \"acc_norm_stderr\": 0.0399926287661772\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7960526315789473,\n \"acc_stderr\": 0.0327900040631005,\n \"acc_norm\": 0.7960526315789473,\n \"acc_norm_stderr\": 0.0327900040631005\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7622641509433963,\n \"acc_stderr\": 0.02619980880756193,\n \"acc_norm\": 0.7622641509433963,\n \"acc_norm_stderr\": 0.02619980880756193\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.03309615177059006,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.03309615177059006\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.033450369167889904,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.033450369167889904\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6851063829787234,\n \"acc_stderr\": 0.03036358219723817,\n \"acc_norm\": 0.6851063829787234,\n \"acc_norm_stderr\": 0.03036358219723817\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.5877192982456141,\n \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947558,\n \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947558\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47354497354497355,\n \"acc_stderr\": 0.025715239811346758,\n \"acc_norm\": 0.47354497354497355,\n \"acc_norm_stderr\": 0.025715239811346758\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8387096774193549,\n \"acc_stderr\": 0.020923327006423294,\n \"acc_norm\": 0.8387096774193549,\n \"acc_norm_stderr\": 0.020923327006423294\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5960591133004927,\n \"acc_stderr\": 0.03452453903822033,\n \"acc_norm\": 0.5960591133004927,\n \"acc_norm_stderr\": 0.03452453903822033\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8585858585858586,\n \"acc_stderr\": 0.024825909793343336,\n \"acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.024825909793343336\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9585492227979274,\n \"acc_stderr\": 0.01438543285747646,\n \"acc_norm\": 0.9585492227979274,\n \"acc_norm_stderr\": 0.01438543285747646\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7205128205128205,\n \"acc_stderr\": 0.022752388839776823,\n \"acc_norm\": 0.7205128205128205,\n \"acc_norm_stderr\": 0.022752388839776823\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.029869605095316908,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.029869605095316908\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.027205371538279472,\n \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.027205371538279472\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8715596330275229,\n \"acc_stderr\": 0.014344977542914318,\n \"acc_norm\": 0.8715596330275229,\n \"acc_norm_stderr\": 0.014344977542914318\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.02485747808025045,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02485747808025045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8776371308016878,\n \"acc_stderr\": 0.02133174182974679,\n \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.02133174182974679\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699813,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699813\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.0349814938546247,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.0349814938546247\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8842975206611571,\n \"acc_stderr\": 0.02919980245562281,\n \"acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.02919980245562281\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n \"acc_stderr\": 0.019119892798924974,\n \"acc_norm\": 0.905982905982906,\n \"acc_norm_stderr\": 0.019119892798924974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8722860791826309,\n \"acc_stderr\": 
0.011935626313999878,\n \"acc_norm\": 0.8722860791826309,\n \"acc_norm_stderr\": 0.011935626313999878\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.791907514450867,\n \"acc_stderr\": 0.021855255263421795,\n \"acc_norm\": 0.791907514450867,\n \"acc_norm_stderr\": 0.021855255263421795\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4558659217877095,\n \"acc_stderr\": 0.01665722942458631,\n \"acc_norm\": 0.4558659217877095,\n \"acc_norm_stderr\": 0.01665722942458631\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8006535947712419,\n \"acc_stderr\": 0.02287581699346408,\n \"acc_norm\": 0.8006535947712419,\n \"acc_norm_stderr\": 0.02287581699346408\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7845659163987139,\n \"acc_stderr\": 0.023350225475471442,\n \"acc_norm\": 0.7845659163987139,\n \"acc_norm_stderr\": 0.023350225475471442\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.021185893615225153,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.021185893615225153\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5443285528031291,\n \"acc_stderr\": 0.012719949543032226,\n \"acc_norm\": 0.5443285528031291,\n \"acc_norm_stderr\": 0.012719949543032226\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7867647058823529,\n \"acc_stderr\": 0.024880971512294254,\n \"acc_norm\": 0.7867647058823529,\n \"acc_norm_stderr\": 0.024880971512294254\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.01716058723504635,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.01716058723504635\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7836734693877551,\n \"acc_stderr\": 0.026358916334904028,\n \"acc_norm\": 0.7836734693877551,\n \"acc_norm_stderr\": 0.026358916334904028\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n \"acc_stderr\": 0.023335401790166323,\n \"acc_norm\": 0.8756218905472637,\n \"acc_norm_stderr\": 0.023335401790166323\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5152998776009792,\n \"mc1_stderr\": 0.017495304473187902,\n \"mc2\": 0.665216588266765,\n \"mc2_stderr\": 0.014619883028401507\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498428\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5852918877937832,\n \"acc_stderr\": 0.013570623842304504\n }\n}\n```", "repo_url": 
"https://huggingface.co/PSanni/MPOMixtral-8x7B-Instruct-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|arc:challenge|25_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|gsm8k|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hellaswag|10_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T14-23-30.207507.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T14-23-30.207507.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T14-23-30.207507.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T14-23-30.207507.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T14-23-30.207507.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T14-23-30.207507.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["**/details_harness|winogrande|5_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T14-23-30.207507.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T14_23_30.207507", "path": ["results_2024-01-14T14-23-30.207507.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T14-23-30.207507.parquet"]}]}]} | 2024-01-14T14:26:04+00:00 |
b4bf7622fe19b0699aac7cd8af752c81ce098e2d | fuyu-quant/ibl-regression-ver1-mix | [
"region:us"
] | 2024-01-14T14:27:00+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "index", "dtype": "int64"}, {"name": "category", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 43540321, "num_examples": 30000}, {"name": "test", "num_bytes": 1455799, "num_examples": 1000}], "download_size": 23195654, "dataset_size": 44996120}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-14T14:43:04+00:00 |
|
61a82414fa150f2923ad3515473e79681797fbe2 |
This repository contains importance matrix datasets for use with the improved quantization methods recently added to `llama.cpp`.
The importance matrix has been computed using `wiki.train.raw` as training data.
The file names should be self-explanatory.
To use, after cloning this repo, e.g. for Mixtral-8x7B and `Q4_K_M` quantization, run
```
./quantize --imatrix path_to_repo/mixtral-8x7b.imatrix path_to_model ggml-model-q4k-m.gguf Q4_K_M
```
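For convenience, here is a minimal sketch of fetching one of the imatrix files from this repo with `huggingface_hub` before running `./quantize`; the file name `mixtral-8x7b.imatrix` is simply copied from the example above and is an assumption about the repo layout:
```python
from huggingface_hub import hf_hub_download

# download an importance-matrix file from this dataset repo
imatrix_path = hf_hub_download(
    repo_id='ikawrakow/imatrix-from-wiki-train',
    repo_type='dataset',
    filename='mixtral-8x7b.imatrix',  # assumed file name, taken from the quantize example above
)
print(imatrix_path)  # pass this path to ./quantize via --imatrix
```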
| ikawrakow/imatrix-from-wiki-train | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T14:27:02+00:00 | {"license": "apache-2.0"} | 2024-01-14T15:11:22+00:00 |
2f2332989ebc1a0b262f84f3366d91590b0f9849 |
# Dataset of radford/ラドフォード/拉德福特 (Azur Lane)
This is the dataset of radford/ラドフォード/拉德福特 (Azur Lane), containing 16 images and their tags.
The core tags of this character are `pink_hair, blue_eyes, long_hair, ribbon, bow, hair_bow, hair_ribbon, ponytail, symbol-shaped_pupils`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 16 | 14.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/radford_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 16 | 9.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/radford_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 34 | 20.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/radford_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 16 | 13.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/radford_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 34 | 26.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/radford_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/radford_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | looking_at_viewer, 1girl, blush, solo, skirt, lollipop, navel, midriff, smile, belt, open_mouth, heart, holding, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | 1girl | blush | solo | skirt | lollipop | navel | midriff | smile | belt | open_mouth | heart | holding | simple_background | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:--------|:--------|:-------|:--------|:-----------|:--------|:----------|:--------|:-------|:-------------|:--------|:----------|:--------------------|:-------------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/radford_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T14:29:21+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T14:32:56+00:00 |
373dab56a041184b72b88335a003956b41b294a5 |
# Dataset of san_jacinto/サン・ジャシント/圣哈辛托 (Azur Lane)
This is the dataset of san_jacinto/サン・ジャシント/圣哈辛托 (Azur Lane), containing 17 images and their tags.
The core tags of this character are `breasts, large_breasts, purple_eyes, bangs, short_hair, white_hair, animal_ears, fake_animal_ears, rabbit_ears, bow, tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 17 | 26.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/san_jacinto_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 17 | 13.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/san_jacinto_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 38 | 26.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/san_jacinto_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 17 | 22.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/san_jacinto_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 38 | 40.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/san_jacinto_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/san_jacinto_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, looking_at_viewer, solo, smile, bare_shoulders, playboy_bunny, black_leotard, pantyhose, blush, sideboob, sitting, closed_mouth, detached_collar, wrist_cuffs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | smile | bare_shoulders | playboy_bunny | black_leotard | pantyhose | blush | sideboob | sitting | closed_mouth | detached_collar | wrist_cuffs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:-----------------|:----------------|:----------------|:------------|:--------|:-----------|:----------|:---------------|:------------------|:--------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/san_jacinto_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T14:29:30+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T14:33:13+00:00 |
38a071912aeb70389c5b62757756046b4f46851e | fuyu-quant/ibl-regression-ver1-linear | [
"region:us"
] | 2024-01-14T14:30:08+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "index", "dtype": "int64"}, {"name": "category", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 41345573, "num_examples": 30000}, {"name": "test", "num_bytes": 1379006, "num_examples": 1000}], "download_size": 22480074, "dataset_size": 42724579}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-14T14:30:13+00:00 |
|
c4c544b37075b7019e39b8ba2d1ad3e439dc0eb6 | talrid/CodeContests_valid_and_test_AlphaCodium | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T14:30:52+00:00 | {"license": "apache-2.0"} | 2024-01-14T14:33:15+00:00 |
|
0c7376e8d184f3c613ccb70c2548c4d031aa21c7 | olemeyer/mnt-instruct-2 | [
"region:us"
] | 2024-01-14T14:30:55+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "response", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 974888583, "num_examples": 579298}], "download_size": 533437374, "dataset_size": 974888583}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T14:31:21+00:00 |
|
15f59839236255155a8d12d20082d10cb5a5711b |
# Dataset of carabiniere/カラビニエーレ/龙骑兵 (Azur Lane)
This is the dataset of carabiniere/カラビニエーレ/龙骑兵 (Azur Lane), containing 12 images and their tags.
The core tags of this character are `blonde_hair, purple_eyes, breasts, bangs, curly_hair, hair_between_eyes, hat, short_hair, hairband`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 16.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/carabiniere_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 9.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/carabiniere_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 26 | 18.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/carabiniere_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 14.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/carabiniere_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 26 | 26.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/carabiniere_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/carabiniere_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, blush, flower, white_dress, holding, solo_focus, umbrella |
| 1 | 6 |  |  |  |  |  | 1girl, epaulettes, long_sleeves, white_gloves, garter_straps, saber_(weapon), solo, chick, closed_mouth, holding, looking_at_viewer, multicolored_cape, sheathed, single_thighhigh, animal_on_head, belt, bicorne, black_cape, black_headwear, black_jacket, black_skirt, blue_headwear, cannon, full_body, gun, knee_boots, turret, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | blush | flower | white_dress | holding | solo_focus | umbrella | epaulettes | long_sleeves | white_gloves | garter_straps | saber_(weapon) | solo | chick | closed_mouth | multicolored_cape | sheathed | single_thighhigh | animal_on_head | belt | bicorne | black_cape | black_headwear | black_jacket | black_skirt | blue_headwear | cannon | full_body | gun | knee_boots | turret | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:---------|:--------------|:----------|:-------------|:-----------|:-------------|:---------------|:---------------|:----------------|:-----------------|:-------|:--------|:---------------|:--------------------|:-----------|:-------------------|:-----------------|:-------|:----------|:-------------|:-----------------|:---------------|:--------------|:----------------|:---------|:------------|:------|:-------------|:---------|:-------------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/carabiniere_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T14:38:57+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T14:43:14+00:00 |
439cff26ab630484a5fa7c5386f76ba6dc273175 | valerievloef/Thesis_BERT | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T14:44:22+00:00 | {"license": "apache-2.0"} | 2024-01-14T14:45:21+00:00 |
|
d28172e01deff440b9bc4846a17bd21f7a25b6c3 | fuyu-quant/ibl-regression-ver1-branch | [
"region:us"
] | 2024-01-14T14:44:44+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "index", "dtype": "int64"}, {"name": "category", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 42612428, "num_examples": 30000}, {"name": "test", "num_bytes": 1419385, "num_examples": 1000}], "download_size": 20886073, "dataset_size": 44031813}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-14T14:44:48+00:00 |
|
9c39c927c1896cf4ad92cf46f05dada5739a887b | fuyu-quant/ibl-regression-ver1-all | [
"region:us"
] | 2024-01-14T14:46:54+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "index", "dtype": "int64"}, {"name": "category", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 42499380, "num_examples": 30000}, {"name": "test", "num_bytes": 1416656, "num_examples": 1000}], "download_size": 22409238, "dataset_size": 43916036}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-14T14:46:58+00:00 |
|
561f491efe3dd67e079e7a318f65f50712fce8c8 |
# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [andysalerno/openchat-nectar-0.5](https://huggingface.co/andysalerno/openchat-nectar-0.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_andysalerno__openchat-nectar-0.5",
"harness_winogrande_5",
split="train")
```
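As a further sketch (not part of the original card), the aggregated "results" configuration mentioned above can presumably be loaded the same way; the "latest" split name is an assumption based on the data_files layout in this repo's configuration metadata:
```python
from datasets import load_dataset

# sketch: load the aggregated "results" configuration of this evaluation run
# (the "latest" split name follows the data_files entries in the repo metadata)
results = load_dataset(
    "open-llm-leaderboard/details_andysalerno__openchat-nectar-0.5",
    "results",
    split="latest",
)
print(results)
```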
## Latest results
These are the [latest results from run 2024-01-14T14:46:05.051264](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.5/blob/main/results_2024-01-14T14-46-05.051264.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.656162636449764,
"acc_stderr": 0.03183140048341385,
"acc_norm": 0.6568859855566402,
"acc_norm_stderr": 0.03248609595939316,
"mc1": 0.3574051407588739,
"mc1_stderr": 0.01677659967672941,
"mc2": 0.5215263336816769,
"mc2_stderr": 0.015319650015486606
},
"harness|arc:challenge|25": {
"acc": 0.6305460750853242,
"acc_stderr": 0.014104578366491892,
"acc_norm": 0.6672354948805461,
"acc_norm_stderr": 0.013769863046192309
},
"harness|hellaswag|10": {
"acc": 0.6392152957578172,
"acc_stderr": 0.004792467255899766,
"acc_norm": 0.835291774546903,
"acc_norm_stderr": 0.003701589571274316
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.032081157507886836,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.032081157507886836
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483016,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483016
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945627,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.02967090612463088,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.02967090612463088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.015173141845126255,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.015173141845126255
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579647,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579647
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699824,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699824
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.013306478243066302,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.013306478243066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.023176298203992,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.023176298203992
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27039106145251396,
"acc_stderr": 0.01485499393801009,
"acc_norm": 0.27039106145251396,
"acc_norm_stderr": 0.01485499393801009
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959603,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959603
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4934810951760104,
"acc_stderr": 0.012769150688867503,
"acc_norm": 0.4934810951760104,
"acc_norm_stderr": 0.012769150688867503
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7279411764705882,
"acc_stderr": 0.02703304115168146,
"acc_norm": 0.7279411764705882,
"acc_norm_stderr": 0.02703304115168146
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069443,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.02752963744017493,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.02752963744017493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3574051407588739,
"mc1_stderr": 0.01677659967672941,
"mc2": 0.5215263336816769,
"mc2_stderr": 0.015319650015486606
},
"harness|winogrande|5": {
"acc": 0.8208366219415943,
"acc_stderr": 0.010777949156047986
},
"harness|gsm8k|5": {
"acc": 0.6815769522365428,
"acc_stderr": 0.012832225723075413
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_andysalerno__openchat-nectar-0.5 | [
"region:us"
] | 2024-01-14T14:48:31+00:00 | {"pretty_name": "Evaluation run of andysalerno/openchat-nectar-0.5", "dataset_summary": "Dataset automatically created during the evaluation run of model [andysalerno/openchat-nectar-0.5](https://huggingface.co/andysalerno/openchat-nectar-0.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andysalerno__openchat-nectar-0.5\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T14:46:05.051264](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.5/blob/main/results_2024-01-14T14-46-05.051264.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.656162636449764,\n \"acc_stderr\": 0.03183140048341385,\n \"acc_norm\": 0.6568859855566402,\n \"acc_norm_stderr\": 0.03248609595939316,\n \"mc1\": 0.3574051407588739,\n \"mc1_stderr\": 0.01677659967672941,\n \"mc2\": 0.5215263336816769,\n \"mc2_stderr\": 0.015319650015486606\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6305460750853242,\n \"acc_stderr\": 0.014104578366491892,\n \"acc_norm\": 0.6672354948805461,\n \"acc_norm_stderr\": 0.013769863046192309\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6392152957578172,\n \"acc_stderr\": 0.004792467255899766,\n \"acc_norm\": 0.835291774546903,\n \"acc_norm_stderr\": 0.003701589571274316\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n 
\"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.032081157507886836,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.032081157507886836\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483016,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483016\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945627,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945627\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.02366129639396428,\n \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.02366129639396428\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3851851851851852,\n \"acc_stderr\": 0.02967090612463088,\n \"acc_norm\": 0.3851851851851852,\n \"acc_norm_stderr\": 0.02967090612463088\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8532110091743119,\n \"acc_stderr\": 0.015173141845126255,\n \"acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.015173141845126255\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579647,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579647\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699824,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699824\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n \"acc_stderr\": 0.013306478243066302,\n \"acc_norm\": 
0.8339719029374202,\n \"acc_norm_stderr\": 0.013306478243066302\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992,\n \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27039106145251396,\n \"acc_stderr\": 0.01485499393801009,\n \"acc_norm\": 0.27039106145251396,\n \"acc_norm_stderr\": 0.01485499393801009\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982478,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982478\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959603,\n \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959603\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4934810951760104,\n \"acc_stderr\": 0.012769150688867503,\n \"acc_norm\": 0.4934810951760104,\n \"acc_norm_stderr\": 0.012769150688867503\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7279411764705882,\n \"acc_stderr\": 0.02703304115168146,\n \"acc_norm\": 0.7279411764705882,\n \"acc_norm_stderr\": 0.02703304115168146\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.02752963744017493,\n \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.02752963744017493\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3574051407588739,\n \"mc1_stderr\": 0.01677659967672941,\n \"mc2\": 0.5215263336816769,\n \"mc2_stderr\": 0.015319650015486606\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8208366219415943,\n \"acc_stderr\": 0.010777949156047986\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6815769522365428,\n \"acc_stderr\": 0.012832225723075413\n }\n}\n```", "repo_url": "https://huggingface.co/andysalerno/openchat-nectar-0.5", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|arc:challenge|25_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|gsm8k|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hellaswag|10_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T14-46-05.051264.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T14-46-05.051264.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T14-46-05.051264.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T14-46-05.051264.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T14-46-05.051264.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T14-46-05.051264.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["**/details_harness|winogrande|5_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T14-46-05.051264.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T14_46_05.051264", "path": ["results_2024-01-14T14-46-05.051264.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T14-46-05.051264.parquet"]}]}]} | 2024-01-14T14:48:51+00:00 |
c37608c53cc555b1e99338add587d3c105a597d2 | AoZhang/nextchat-annotation | [
"region:us"
] | 2024-01-14T14:48:47+00:00 | {} | 2024-01-24T07:09:11+00:00 |
|
a770c37d5f32150de99ea297d2d1e9448298c809 |
# Dataset of magdeburg/マクデブルク/马格德堡 (Azur Lane)
This is the dataset of magdeburg/マクデブルク/马格德堡 (Azur Lane), containing 15 images and their tags.
The core tags of this character are `black_hair, horns, long_hair, breasts, multicolored_hair, red_eyes, bangs, hair_between_eyes, red_hair, large_breasts, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 15 | 20.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/magdeburg_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 15 | 12.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/magdeburg_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 35 | 25.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/magdeburg_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 15 | 18.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/magdeburg_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 35 | 35.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/magdeburg_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/magdeburg_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
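The IMG+TXT packages can be used without waifuc. A minimal sketch for the 800px package follows; the pairing logic assumes the archive contains image files with same-named `.txt` tag files, which is the usual layout for this package type, so adjust it if the structure differs:

```python
import os
import zipfile
from glob import glob

from huggingface_hub import hf_hub_download

# download and extract the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/magdeburg_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
pkg_dir = 'magdeburg_800'
os.makedirs(pkg_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(pkg_dir)

# pair each tag file with its same-named image (assumed layout)
for txt_path in glob(os.path.join(pkg_dir, '**', '*.txt'), recursive=True):
    base, _ = os.path.splitext(txt_path)
    image_path = next(
        (base + ext for ext in ('.png', '.jpg', '.jpeg', '.webp') if os.path.exists(base + ext)),
        None,
    )
    if image_path is None:
        continue
    with open(txt_path, 'r', encoding='utf-8') as f:
        tags = f.read().strip()
    print(image_path, tags)
```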
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, solo, navel, open_mouth, smile, looking_at_viewer, black_bikini, blush, nail_polish, thighhighs, cleavage, cloud, o-ring_bikini, outdoors, see-through, sky, tied_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | navel | open_mouth | smile | looking_at_viewer | black_bikini | blush | nail_polish | thighhighs | cleavage | cloud | o-ring_bikini | outdoors | see-through | sky | tied_shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-------------|:--------|:--------------------|:---------------|:--------|:--------------|:-------------|:-----------|:--------|:----------------|:-----------|:--------------|:------|:-------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/magdeburg_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T14:51:17+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T14:55:01+00:00 |
28f622caad074d9730f577b24a82fec5e03cdd62 |
# Dataset of andrea_doria/アンドレア・ドーリア/安德烈亚·多利亚 (Azur Lane)
This is the dataset of andrea_doria/アンドレア・ドーリア/安德烈亚·多利亚 (Azur Lane), containing 11 images and their tags.
The core tags of this character are `breasts, drill_hair, large_breasts, long_hair, bangs, ahoge, yellow_eyes, brown_eyes, very_long_hair, brown_hair, hair_intakes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 18.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/andrea_doria_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 9.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/andrea_doria_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 31 | 22.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/andrea_doria_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 15.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/andrea_doria_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 31 | 34.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/andrea_doria_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/andrea_doria_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, smile, cleavage, blush, solo, closed_mouth, green_dress, black_pantyhose, indoors, standing, window |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | looking_at_viewer | smile | cleavage | blush | solo | closed_mouth | green_dress | black_pantyhose | indoors | standing | window |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------------|:--------|:-----------|:--------|:-------|:---------------|:--------------|:------------------|:----------|:-----------|:---------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/andrea_doria_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T14:51:19+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T14:55:18+00:00 |
f19283a231f8c2adcea48920a32fa26a7620596d |
# Dataset of z26/Z26 (Azur Lane)
This is the dataset of z26/Z26 (Azur Lane), containing 13 images and their tags.
The core tags of this character are `long_hair, pink_hair, purple_eyes, very_long_hair, twintails, bangs, hair_between_eyes, hair_ornament, hat, black_headwear, mask_on_head, sidelocks, breasts, fang, peaked_cap, small_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 15.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/z26_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 9.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/z26_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 32 | 19.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/z26_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 14.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/z26_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 32 | 27.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/z26_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/z26_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, sleep_mask, solo, black_shirt, collarbone, short_sleeves, simple_background, white_background, holding, off-shoulder_shirt, short_shorts, :d, blush, crop_top, full_body, hairclip, navel, no_shoes, open_mouth, stuffed_toy, white_socks |
| 1 | 6 |  |  |  |  |  | 1girl, black_skirt, looking_at_viewer, midriff, miniskirt, navel, solo, boots, capelet, hair_flaps, pleated_skirt, red_gloves, suspender_skirt, ahoge, closed_mouth, full_body, simple_background, socks, v-shaped_eyebrows, white_background, armpits, black_shirt, buttons, crop_top_overhang, frown, outstretched_arm, revealing_clothes, sleeveless_shirt, stomach, thigh_strap, thighs, undershirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | looking_at_viewer | sleep_mask | solo | black_shirt | collarbone | short_sleeves | simple_background | white_background | holding | off-shoulder_shirt | short_shorts | :d | blush | crop_top | full_body | hairclip | navel | no_shoes | open_mouth | stuffed_toy | white_socks | black_skirt | midriff | miniskirt | boots | capelet | hair_flaps | pleated_skirt | red_gloves | suspender_skirt | ahoge | closed_mouth | socks | v-shaped_eyebrows | armpits | buttons | crop_top_overhang | frown | outstretched_arm | revealing_clothes | sleeveless_shirt | stomach | thigh_strap | thighs | undershirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------------|:-------------|:-------|:--------------|:-------------|:----------------|:--------------------|:-------------------|:----------|:---------------------|:---------------|:-----|:--------|:-----------|:------------|:-----------|:--------|:-----------|:-------------|:--------------|:--------------|:--------------|:----------|:------------|:--------|:----------|:-------------|:----------------|:-------------|:------------------|:--------|:---------------|:--------|:--------------------|:----------|:----------|:--------------------|:--------|:-------------------|:--------------------|:-------------------|:----------|:--------------|:---------|:-------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | | X | | X | X | | | X | X | | | | | | | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/z26_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T14:51:19+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T14:55:27+00:00 |
26aad0b206850c4bdc5613a05ea1545fa4c964bb | Navvye/TrialTSV | [
"license:mit",
"region:us"
] | 2024-01-14T14:57:11+00:00 | {"license": "mit"} | 2024-01-14T14:57:47+00:00 |
|
8e72de89ef59e9a969fcea910c6806080dc33881 | icewiny/blur_cn | [
"license:mit",
"region:us"
] | 2024-01-14T15:00:36+00:00 | {"license": "mit"} | 2024-01-14T15:00:36+00:00 |
|
9e48ba6b3eda5d1715621a299d7d9ab7140d8068 | # Promoter Sequences for Various Plant Species
This dataset contains the promoter sequences for **241 different plant species** and has been used for the pretraining step of [`Florabert`](https://huggingface.co/Gurveer05/FloraBERT). It was created by processing the raw FASTA files and the GFF3 / GFF files from [`Ensembl`](https://plants.ensembl.org/) and [`Refseq`](https://www.ncbi.nlm.nih.gov/refseq/).
*samtools* and *bedtools* were used to extract the promoter sequences from these files, taking the region 1 kb upstream of each sequence.
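A minimal sketch of loading the sequences with the Hugging Face `datasets` library is shown below; it relies on the train/test split described next, and the exact column names are not guaranteed, so inspect the loaded features first.

```python
from datasets import load_dataset

# Loads the train/test splits described below (90-10 split).
ds = load_dataset('Gurveer05/plant-promoter-sequences')
print(ds)              # inspect the available splits and column names
print(ds['train'][0])  # one promoter sequence record
```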
The data has been split into train and test sets (90-10 split). In all, there are ~10 million sequences across the split files. The steps followed to obtain this data are available in this [`GitHub repository`](https://github.com/gurveervirk/florabert). | Gurveer05/plant-promoter-sequences | [
"size_categories:10M<n<100M",
"biology",
"region:us"
] | 2024-01-14T15:03:50+00:00 | {"size_categories": ["10M<n<100M"], "tags": ["biology"]} | 2024-01-14T17:26:56+00:00 |
6f3fb8fcd91ea065cafa081377328757f91641e2 |
# Dataset of friedrich_eckoldt/Z16 (Azur Lane)
This is the dataset of friedrich_eckoldt/Z16 (Azur Lane), containing 11 images and their tags.
The core tags of this character are `black_hair, multicolored_hair, red_eyes, streaked_hair, bangs, breasts, long_hair, white_hair, horns, x-shaped_pupils, symbol-shaped_pupils, two-tone_hair, hair_between_eyes, v-shaped_eyebrows`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 15.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/friedrich_eckoldt_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 8.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/friedrich_eckoldt_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 26 | 17.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/friedrich_eckoldt_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 12.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/friedrich_eckoldt_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 26 | 26.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/friedrich_eckoldt_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/friedrich_eckoldt_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, solo, navel, black_jacket, bare_shoulders, crop_top, long_sleeves, midriff, stomach, black_thighhighs, iron_cross, off_shoulder, standing, thigh_strap, white_panties, cowboy_shot, open_mouth, red_gloves, simple_background, skindentation, thighs, white_shirt, :d, black_footwear, black_gloves, full_body, open_jacket, sharp_teeth, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | navel | black_jacket | bare_shoulders | crop_top | long_sleeves | midriff | stomach | black_thighhighs | iron_cross | off_shoulder | standing | thigh_strap | white_panties | cowboy_shot | open_mouth | red_gloves | simple_background | skindentation | thighs | white_shirt | :d | black_footwear | black_gloves | full_body | open_jacket | sharp_teeth | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:---------------|:-----------------|:-----------|:---------------|:----------|:----------|:-------------------|:-------------|:---------------|:-----------|:--------------|:----------------|:--------------|:-------------|:-------------|:--------------------|:----------------|:---------|:--------------|:-----|:-----------------|:---------------|:------------|:--------------|:--------------|:-------------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/friedrich_eckoldt_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T15:08:50+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T15:11:51+00:00 |
e56367f7062e03b7ad7c3e9dc5d05d85eb5bd5f8 | modelloosrvcc/WukongMico | [
"license:openrail",
"region:us"
] | 2024-01-14T15:11:11+00:00 | {"license": "openrail"} | 2024-01-14T15:11:57+00:00 |
|
f168c18bea3933d013832f904b7558f877993d2b | sayalishankar/requireddataset | [
"region:us"
] | 2024-01-14T15:18:32+00:00 | {} | 2024-01-14T15:21:23+00:00 |
|
9958f85a0ed801a4a28f3c205b7b910a34b2da27 | RamazanTM/EngRussPretrain | [
"license:openrail",
"region:us"
] | 2024-01-14T15:18:58+00:00 | {"license": "openrail"} | 2024-01-14T15:35:25+00:00 |
|
155539f0708c88ca3d1554453ee7d95a900abf85 |
# Dataset of louisville/ルイビル/路易斯维尔 (Azur Lane)
This is the dataset of louisville/ルイビル/路易斯维尔 (Azur Lane), containing 18 images and their tags.
The core tags of this character are `breasts, long_hair, hair_over_one_eye, large_breasts, blue_eyes, braid, bow, animal_ears, fake_animal_ears, hair_ornament, rabbit_ears, pink_hair, huge_breasts, very_long_hair, blue_bow, purple_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 18 | 28.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/louisville_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 18 | 16.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/louisville_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 46 | 35.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/louisville_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 18 | 25.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/louisville_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 46 | 51.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/louisville_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/louisville_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
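On top of the loader above, the items can be turned into plain image/caption pairs (e.g. for LoRA training). This sketch assumes `item.image` is a PIL image and `item.meta['tags']` is a mapping or list of tag strings, so adjust it if the metadata is shaped differently:

```python
import os

from waifuc.source import LocalSource

export_dir = 'louisville_pairs'
os.makedirs(export_dir, exist_ok=True)

# 'dataset_dir' is the directory extracted in the snippet above
for i, item in enumerate(LocalSource('dataset_dir')):
    tags = item.meta.get('tags') or {}
    # join tag names into a simple comma-separated caption (assumed structure)
    caption = ', '.join(tags if isinstance(tags, (list, tuple)) else tags.keys())
    item.image.save(os.path.join(export_dir, f'{i:04d}.png'))
    with open(os.path.join(export_dir, f'{i:04d}.txt'), 'w', encoding='utf-8') as f:
        f.write(caption)
```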
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | bare_shoulders, bowtie, cleavage, detached_collar, playboy_bunny, 1girl, looking_at_viewer, white_gloves, blue_leotard, solo, blush, white_pantyhose, official_alternate_costume, strapless_leotard, holding_tray, breast_rest |
| 1 | 5 |  |  |  |  |  | 1girl, cleavage, long_sleeves, solo, dress, looking_at_viewer, white_gloves, white_thighhighs, blush, frills, garter_straps, simple_background, white_background, bangs, clothes_lift, full_body, lifted_by_self, skirt, smile, white_panties |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | bare_shoulders | bowtie | cleavage | detached_collar | playboy_bunny | 1girl | looking_at_viewer | white_gloves | blue_leotard | solo | blush | white_pantyhose | official_alternate_costume | strapless_leotard | holding_tray | breast_rest | long_sleeves | dress | white_thighhighs | frills | garter_straps | simple_background | white_background | bangs | clothes_lift | full_body | lifted_by_self | skirt | smile | white_panties |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------|:---------|:-----------|:------------------|:----------------|:--------|:--------------------|:---------------|:---------------|:-------|:--------|:------------------|:-----------------------------|:--------------------|:---------------|:--------------|:---------------|:--------|:-------------------|:---------|:----------------|:--------------------|:-------------------|:--------|:---------------|:------------|:-----------------|:--------|:--------|:----------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | | | X | | | X | X | X | | X | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/louisville_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T15:29:53+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T15:34:10+00:00 |
aadc3836f337cb17823e07dd83cb7d3773af9903 |
# Dataset of matchless/マッチレス/无敌 (Azur Lane)
This is the dataset of matchless/マッチレス/无敌 (Azur Lane), containing 16 images and their tags.
The core tags of this character are `purple_hair, hat, purple_eyes, short_hair, bangs, breasts, beret, black_headwear, ribbon, hair_ornament, rabbit_hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 16 | 18.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matchless_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 16 | 10.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matchless_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 35 | 22.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matchless_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 16 | 16.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matchless_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 35 | 29.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matchless_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/matchless_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, bare_shoulders, blush, white_dress, looking_at_viewer, smile, solo, choker, collarbone, pink_bow, white_footwear, bag, holding_food, ice_cream_cone, lifebuoy, shoes, tongue_out, bench, bird, brick_floor, closed_mouth, military_hat, off_shoulder, sitting, torpedo_tubes |
| 1 | 9 |  |  |  |  |  | 1girl, looking_at_viewer, solo, bare_shoulders, black_gloves, blush, smile, open_mouth, sleeveless_shirt, pink_skirt, star_(symbol), white_shirt, kneehighs, white_socks, ;d, full_body, mole, one_eye_closed |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | blush | white_dress | looking_at_viewer | smile | solo | choker | collarbone | pink_bow | white_footwear | bag | holding_food | ice_cream_cone | lifebuoy | shoes | tongue_out | bench | bird | brick_floor | closed_mouth | military_hat | off_shoulder | sitting | torpedo_tubes | black_gloves | open_mouth | sleeveless_shirt | pink_skirt | star_(symbol) | white_shirt | kneehighs | white_socks | ;d | full_body | mole | one_eye_closed |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------|:--------------|:--------------------|:--------|:-------|:---------|:-------------|:-----------|:-----------------|:------|:---------------|:-----------------|:-----------|:--------|:-------------|:--------|:-------|:--------------|:---------------|:---------------|:---------------|:----------|:----------------|:---------------|:-------------|:-------------------|:-------------|:----------------|:--------------|:------------|:--------------|:-----|:------------|:-------|:-----------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/matchless_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T15:30:03+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T15:33:31+00:00 |
e0b052512c0312b3a1ee43bf1700168a8c607c13 | valerievloef/Thesis_BERT_part | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T15:34:16+00:00 | {"license": "apache-2.0"} | 2024-01-14T15:34:51+00:00 |
|
3531d9a7a8a01c1619b297a509fb9485b37d80ae |
This is an unfiltered dataset of images scraped from the internet for the Detective Conan character Masumi Sera | 234bcn/masumi_sera_images | [
"language:en",
"region:us"
] | 2024-01-14T15:41:22+00:00 | {"language": ["en"], "pretty_name": "Masumi Sera Dataset for AI"} | 2024-01-15T18:38:20+00:00 |
6341b7a2621088af05364a7a711c811b422e4b96 | mydesigns82/shiba | [
"license:mit",
"region:us"
] | 2024-01-14T15:41:38+00:00 | {"license": "mit"} | 2024-01-14T15:47:52+00:00 |
|
3d1bfe3ed7a68e99e2780155b14d38956b76d5ea | # Dataset Card for "real-toxicity-prompts_first_5K"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Leogrin/real-toxicity-prompts_first_5K | [
"region:us"
] | 2024-01-14T15:48:57+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "filename", "dtype": "string"}, {"name": "begin", "dtype": "int64"}, {"name": "end", "dtype": "int64"}, {"name": "challenging", "dtype": "bool"}, {"name": "prompt", "struct": [{"name": "text", "dtype": "string"}, {"name": "profanity", "dtype": "float64"}, {"name": "sexually_explicit", "dtype": "float64"}, {"name": "identity_attack", "dtype": "float64"}, {"name": "flirtation", "dtype": "float64"}, {"name": "threat", "dtype": "float64"}, {"name": "insult", "dtype": "float64"}, {"name": "severe_toxicity", "dtype": "float64"}, {"name": "toxicity", "dtype": "float64"}]}, {"name": "continuation", "struct": [{"name": "text", "dtype": "string"}, {"name": "severe_toxicity", "dtype": "float64"}, {"name": "toxicity", "dtype": "float64"}, {"name": "profanity", "dtype": "float64"}, {"name": "sexually_explicit", "dtype": "float64"}, {"name": "identity_attack", "dtype": "float64"}, {"name": "flirtation", "dtype": "float64"}, {"name": "threat", "dtype": "float64"}, {"name": "insult", "dtype": "float64"}]}], "splits": [{"name": "train", "num_bytes": 1701249, "num_examples": 5000}], "download_size": 1566036, "dataset_size": 1701249}} | 2024-01-14T15:48:59+00:00 |
60c898ac22737376ba892d481d9a728ea8fafd49 | hojzas/test | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T15:49:59+00:00 | {"license": "apache-2.0"} | 2024-01-14T15:49:59+00:00 |
|
5e830fb2119498b53299c76b814f2fe82ee601ff | hojzas/autotrain-data-test | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T15:50:58+00:00 | {"license": "apache-2.0", "dataset_info": {"features": [{"name": "autotrain_text", "dtype": "string"}, {"name": "autotrain_label", "dtype": {"class_label": {"names": {"0": "negative", "1": "positive"}}}}], "splits": [{"name": "train", "num_bytes": 167, "num_examples": 6}, {"name": "validation", "num_bytes": 52, "num_examples": 2}], "download_size": 3117, "dataset_size": 219}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-14T15:54:34+00:00 |
|
3892020f4cf0f9f3beb54f50a1cc771774bb9d1a | supundhananjaya/MonoGeoDepth-dataset | [
"region:us"
] | 2024-01-14T16:00:34+00:00 | {} | 2024-01-14T16:00:34+00:00 |
|
1e0846f979f3900068fffebbfa681ccaaf165aea | EllieS/pubmedqa_dpo_data | [
"region:us"
] | 2024-01-14T16:01:00+00:00 | {} | 2024-01-16T04:29:02+00:00 |
|
0d7215891cccba2036f851aee05b2a1a8ed76e9f | cmolinier/pokemon_diamond_ost_mid_tkn | [
"region:us"
] | 2024-01-14T16:01:50+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 15970638, "num_examples": 592}], "download_size": 2228577, "dataset_size": 15970638}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T16:06:33+00:00 |
|
dfccbf5d5206330d074a0d31e1a564b68bf6a4eb |
# DeliData
This is a README that outlines key fields and characteristics of the DeliData corpus.
For a full description of how we collected DeliData, as well as possible applications, please refer to the original
paper [link](#citation).
# Data Fields
###### group_id
Unique identifier of the group chat
###### message_id
Message identifier. System messages will have an id of -1; however, all participant messages' ids are unique.
###### message_type
- INITIAL - indicates the cards presented and the aliases of participants
- SUBMIT - indicates that a participant has pressed the Submit Solution button
- MESSAGE - denotes a chat entry
###### origin
The alias of the participant who submitted a message/solution
###### original_text
Original text as said in the collected conversation;
For INITIAL type, contains the list of participants and cards presented.
For SUBMIT type, contains the cards submitted
###### clean_text
Normalised message with tokenisation applied and special tokens masked. Special tokens are solution mentions, which are masked with < CARD >, and participant mentions, which are masked with < MENTION >.
###### annotation_type
A record from the first level of DeliAnnotation. Can be Probing, Non-probing deliberation, or None. For more details,
please refer to the DeliData paper.
###### annotation_target
A record from the second level of DeliAnnotation. Can be Moderation, Reasoning, Solution, Agree, or Disagree. For more
details, please refer to the DeliData paper.
###### annotation_additional
A record from the third level of DeliAnnotation. Can be partial_solution, complete_solution, specific_referee,
solution_summary, or consider_opposite. For more details, please refer to the DeliData paper.
###### team_performance
An approximation of team performance, based on user submissions and solution mentions. Range [0-1], where 1 indicates that every participant selected the correct solution.
###### performance_change
Change in performance compared to the previous utterance
###### sol_tracker_message
Extracted solution from the current message
###### sol_tracker_all
Up-to-date "state-of-mind" for each of the participants, i.e. an approximation of what each participant think the
correct solution is at given timestep. This is based on initial solutions, submitted solutions, and solution mentions.
team_performance value is calculated based on this column
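The snippet below is a minimal sketch (not an official loader) of how the fields described above could be explored once the corpus is read into a dataframe; the file name `delidata.tsv` and the tab-separated layout are assumptions made purely for illustration.
```python
# Exploration sketch for the fields described above.
# NOTE: "delidata.tsv" and the tab-separated layout are assumptions;
# adjust them to the actual file(s) shipped with the corpus.
import pandas as pd

df = pd.read_csv("delidata.tsv", sep="\t")

# Keep only participant chat entries (drop INITIAL/SUBMIT rows).
messages = df[df["message_type"] == "MESSAGE"]

# Distribution of the first annotation level (Probing / Non-probing / None).
print(messages["annotation_type"].value_counts(dropna=False))

# Final team_performance recorded for each group chat.
final_performance = df.groupby("group_id")["team_performance"].last()
print(final_performance.describe())
```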
### Citation
**DeliData: A dataset for deliberation in multi-party problem solving (https://delibot.xyz/delidata)**
```bibtex
@article{karadzhov2023delidata,
  title={DeliData: A dataset for deliberation in multi-party problem solving},
  author={Karadzhov, Georgi and Stafford, Tom and Vlachos, Andreas},
  journal={Proceedings of the ACM on Human-Computer Interaction},
  volume={7},
  number={CSCW2},
  pages={1--25},
  year={2023},
  publisher={ACM New York, NY, USA}
}
```
| gkaradzhov/DeliData | [
"license:cc-by-4.0",
"region:us"
] | 2024-01-14T16:08:40+00:00 | {"license": "cc-by-4.0"} | 2024-01-14T16:10:42+00:00 |
0354f80bb4095c3e18800966ad1b6a296cfcc530 | model: https://huggingface.co/sentence-transformers/clip-ViT-L-14 # 1.71GB | teamtom/25000_word_emb_large | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T16:10:20+00:00 | {"license": "apache-2.0"} | 2024-01-14T16:12:52+00:00 |
9b60961400df84d2e0ac0015d34b0679cb53bb1a | YigitKoca/MMLU_EN_1 | [
"region:us"
] | 2024-01-14T16:10:34+00:00 | {} | 2024-01-14T16:16:10+00:00 |
|
3403562f12c2649f36539e093aef7eb37c59314f | GalDude33/Splats | [
"region:us"
] | 2024-01-14T16:10:36+00:00 | {} | 2024-01-14T16:18:50+00:00 |
|
6efa9ce2552dc2d550d6bca8fa8329f9515dff54 | Andongne/repo_despina | [
"region:us"
] | 2024-01-14T16:13:44+00:00 | {} | 2024-01-14T16:13:44+00:00 |
|
abe125a775a4f8d861497973dba3904581020460 | Andongne/despina | [
"region:us"
] | 2024-01-14T16:14:05+00:00 | {} | 2024-01-14T17:07:05+00:00 |
|
e15cafc333c59626d359a2c10516ae667e43c6ff | zhanjun/cfff_qianyi | [
"region:us"
] | 2024-01-14T16:19:14+00:00 | {} | 2024-01-14T17:35:07+00:00 |
|
06a8818ab1c008a02ce8e6f140f6372bc74707c8 | yeager89/mikasa | [
"region:us"
] | 2024-01-14T16:23:47+00:00 | {} | 2024-01-14T16:26:46+00:00 |
|
d4f20ff2896edaf78b384691615f3b083f27a18d | # Promoter Sequences for Maize NAM lines
This dataset contains the promoter sequences for **26 Maize NAM lines** and has been used for the fine-tuning step of [`Florabert`](https://huggingface.co/Gurveer05/FloraBERT). It was created by processing the raw FASTA files and GFF3 files from [`MaizeGDB`](https://www.maizegdb.org/) for the 26 NAM lines.
*samtools* and *bedtools* were used to extract the promoter sequences, defined as the regions 1 kb upstream of each gene.
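As a rough illustration of that step (this is not the authors' samtools/bedtools pipeline, and the file names are placeholders), the sketch below pulls the 1 kb region upstream of each gene from a genome FASTA and its GFF3 annotation; reverse-complementing of minus-strand promoters is omitted for brevity.
```python
# Illustrative sketch only: the published data was produced with samtools
# and bedtools. File names below are placeholders, not real MaizeGDB files.
import pysam

genome = pysam.FastaFile("nam_line.fa")      # hypothetical genome FASTA (indexed)
promoters = []

with open("nam_line.gff3") as gff:           # hypothetical GFF3 annotation
    for line in gff:
        if line.startswith("#"):
            continue
        chrom, _, feature, start, end, _, strand, _, _ = line.rstrip("\n").split("\t")
        if feature != "gene":
            continue
        start, end = int(start), int(end)    # GFF3 coordinates are 1-based, inclusive
        if strand == "+":
            promo_start, promo_end = max(0, start - 1 - 1000), start - 1
        else:                                # minus strand: upstream lies past the gene end
            chrom_len = genome.get_reference_length(chrom)
            promo_start, promo_end = end, min(chrom_len, end + 1000)
        promoters.append(genome.fetch(chrom, promo_start, promo_end))

print(f"extracted {len(promoters)} promoter sequences")
```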
The data has been split into train and test data (70-30 split). In all, there are ~ 1 million sequences across the files. The steps followed to obtain this data are available in this [`Github Repository`](https://github.com/gurveervirk/florabert). | Gurveer05/maize-promoter-sequences | [
"size_categories:1M<n<10M",
"biology",
"region:us"
] | 2024-01-14T16:24:21+00:00 | {"size_categories": ["1M<n<10M"], "tags": ["biology"]} | 2024-01-14T17:13:53+00:00 |
531f2de26dada8bcc16f1f3f049baf3cae916dff |
# A classics data set for use with mistral-7b-v0.1
This dataset was used to fine-tune the Mistral 7B base model. It contains 1,640 Q/A pairs on Greek & Roman history.
The dataset was generated via Mixtral-8x7b Instruct v01, run over 512-token chunks of vols. 2 & 3 of Will Durant's 13-volume **Story of Civilization** (*Life of Greece* and *Caesar & Christ*).
Training data was formatted with [INST] and [/INST] delimiting instructions:
```json
{"text": "Q: \"Why did many Greeks come to resent Rome's 'liberation' and 'peacekeeping' efforts, such as forbidding class war and interfering in disputes, despite Rome having given Greece freedom from previous conflicts?\"\nA: Many Greeks came to resent Rome's \"liberation\" and \"peacekeeping\" efforts due to several reasons. First, after the Romans had given Greece freedom...(blah blah blah)...interfering in their domestic affairs, and ultimately"}
``` | wmmarcellino/mistral-7b-v0.1-GreeceRome-v0.1 | [
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-01-14T16:25:18+00:00 | {"language": ["en"], "license": "apache-2.0"} | 2024-01-14T16:29:07+00:00 |
a7a7aafc319d1aaebcd6b6deb1c8c1477f770dcd | Symfomany/datasllm | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T16:27:31+00:00 | {"license": "apache-2.0", "dataset_info": {"features": [{"name": "nom", "dtype": "string"}, {"name": "reconnaitre", "dtype": "string"}, {"name": "important", "dtype": "string"}, {"name": "prevention", "dtype": "string"}, {"name": "lutter", "dtype": "string"}, {"name": "traitement", "dtype": "string"}, {"name": "chat_sample", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 108814, "num_examples": 19}], "download_size": 96525, "dataset_size": 108814}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-15T11:11:46+00:00 |
|
34620ebdf88d4cdb6c109ce36b4f790504638da7 | bhargavi909/Gene_expressions_UCI | [
"region:us"
] | 2024-01-14T16:35:45+00:00 | {} | 2024-01-14T16:35:45+00:00 |
|
2c0e7cc90901e2f969e9aee695ac4a484ef0e1c4 | Atipico1/popQA_preprocessed_unans | [
"region:us"
] | 2024-01-14T16:43:27+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "subj", "dtype": "string"}, {"name": "prop", "dtype": "string"}, {"name": "obj", "dtype": "string"}, {"name": "subj_id", "dtype": "int64"}, {"name": "prop_id", "dtype": "int64"}, {"name": "obj_id", "dtype": "int64"}, {"name": "s_aliases", "dtype": "string"}, {"name": "o_aliases", "dtype": "string"}, {"name": "s_uri", "dtype": "string"}, {"name": "o_uri", "dtype": "string"}, {"name": "s_wiki_title", "dtype": "string"}, {"name": "o_wiki_title", "dtype": "string"}, {"name": "s_pop", "dtype": "int64"}, {"name": "o_pop", "dtype": "int64"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": "string"}, {"name": "ctxs", "list": [{"name": "hasanswer", "dtype": "bool"}, {"name": "id", "dtype": "string"}, {"name": "score", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "title", "dtype": "string"}]}, {"name": "query_embedding", "sequence": "float32"}], "splits": [{"name": "train", "num_bytes": 100743049, "num_examples": 10000}, {"name": "test", "num_bytes": 42959579, "num_examples": 4267}], "download_size": 81183565, "dataset_size": 143702628}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-14T16:43:45+00:00 |
|
1d7ac5906a82d28d43ee632508000e8661b185cb | paulparas/code-review | [
"region:us"
] | 2024-01-14T16:43:59+00:00 | {} | 2024-01-14T16:43:59+00:00 |
|
a721f15f217cb8cccb302a2cab0232077d7db836 | Jungliana/classical-music-abc | [
"region:us"
] | 2024-01-14T16:45:33+00:00 | {} | 2024-01-14T16:49:25+00:00 |
|
9b7bc762ce9d9b7747ec37995e2f9267f91885eb | Vitorbr2009/ds-voz-afauna | [
"license:openrail",
"region:us"
] | 2024-01-14T16:51:06+00:00 | {"license": "openrail"} | 2024-01-14T16:52:02+00:00 |
|
5e4e0fbe391f887c2df1eea061426ec5dc84d9f3 | ganoot/ut-courses | [
"region:us"
] | 2024-01-14T16:55:45+00:00 | {} | 2024-01-14T16:59:49+00:00 |
|
1a47923fb183e14b2674f79c54d0ab1cf223a2e0 |
To see how this dataset was created, watch the following videos:
Are words free?:
* https://youtu.be/Utg_D-yQB_E?si=FKp_QZ4PbKesiDrn
Replacing Chatgpt 3.5 turbo workflows with Openchat:
* https://youtu.be/DNKepnKuZns?si=bleufaiGdwGdrueK
| russellbal/dictionary-openchat-3.5-0106 | [
"license:wtfpl",
"region:us"
] | 2024-01-14T17:02:26+00:00 | {"license": "wtfpl"} | 2024-01-14T18:00:23+00:00 |
8c09f93f609bda3b2f1e93ed27fb9996b52cb250 | Carlosgg14/gokublack | [
"license:openrail",
"region:us"
] | 2024-01-14T17:05:46+00:00 | {"license": "openrail"} | 2024-01-14T17:07:08+00:00 |
|
42150099223f14cf78c039bdd1e80e85da97cc04 |
# Alpaca Hindi Small
This is a synthesized dataset created by translating the Alpaca dataset from English to Hindi. | QuantumMik/alpaca_hindi_small | [
"task_categories:question-answering",
"task_categories:text-generation",
"size_categories:1K<n<10K",
"language:hi",
"license:apache-2.0",
"region:us"
] | 2024-01-14T17:06:44+00:00 | {"language": ["hi"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "task_categories": ["question-answering", "text-generation"]} | 2024-01-16T16:02:58+00:00 |
5ab29f75b16751b2725bddb6c6f57f6327b7746e | epinnock/software-architecture-instructions | [
"region:us"
] | 2024-01-14T17:20:08+00:00 | {"dataset_info": {"features": [{"name": "instructions", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 24799, "num_examples": 210}], "download_size": 10040, "dataset_size": 24799}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T17:22:57+00:00 |
|
649e30f1ec53be5822239f86f98b244cdbfdc414 | pouya-haghi/imagenet-2k | [
"region:us"
] | 2024-01-14T17:20:40+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 86120502.0, "num_examples": 2048}], "download_size": 86073892, "dataset_size": 86120502.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T17:20:54+00:00 |
|
7f4e1788c0add94483c53185193c582bb1c2c4c2 |
# Dataset Card for Evaluation run of Felladrin/Llama-68M-Chat-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Felladrin/Llama-68M-Chat-v1](https://huggingface.co/Felladrin/Llama-68M-Chat-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Felladrin__Llama-68M-Chat-v1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-14T17:25:12.605913](https://huggingface.co/datasets/open-llm-leaderboard/details_Felladrin__Llama-68M-Chat-v1/blob/main/results_2024-01-14T17-25-12.605913.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2518558528274769,
"acc_stderr": 0.030387282193610175,
"acc_norm": 0.25203959947439164,
"acc_norm_stderr": 0.031196164528136557,
"mc1": 0.2741738066095471,
"mc1_stderr": 0.015616518497219376,
"mc2": 0.4726841055154348,
"mc2_stderr": 0.015727848850119193
},
"harness|arc:challenge|25": {
"acc": 0.1885665529010239,
"acc_stderr": 0.011430897647675815,
"acc_norm": 0.23293515358361774,
"acc_norm_stderr": 0.012352507042617405
},
"harness|hellaswag|10": {
"acc": 0.27693686516630156,
"acc_stderr": 0.004465704810893541,
"acc_norm": 0.28271260705038836,
"acc_norm_stderr": 0.004493975527386726
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073461,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073461
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2152777777777778,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.2152777777777778,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.30057803468208094,
"acc_stderr": 0.03496101481191181,
"acc_norm": 0.30057803468208094,
"acc_norm_stderr": 0.03496101481191181
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.1829787234042553,
"acc_stderr": 0.025276041000449966,
"acc_norm": 0.1829787234042553,
"acc_norm_stderr": 0.025276041000449966
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518754,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518754
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.03375672449560554,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.03375672449560554
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918417,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114468,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3383838383838384,
"acc_stderr": 0.03371124142626302,
"acc_norm": 0.3383838383838384,
"acc_norm_stderr": 0.03371124142626302
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.33678756476683935,
"acc_stderr": 0.03410780251836184,
"acc_norm": 0.33678756476683935,
"acc_norm_stderr": 0.03410780251836184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34102564102564104,
"acc_stderr": 0.02403548967633507,
"acc_norm": 0.34102564102564104,
"acc_norm_stderr": 0.02403548967633507
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3445378151260504,
"acc_stderr": 0.030868682604121633,
"acc_norm": 0.3445378151260504,
"acc_norm_stderr": 0.030868682604121633
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23853211009174313,
"acc_stderr": 0.01827257581023187,
"acc_norm": 0.23853211009174313,
"acc_norm_stderr": 0.01827257581023187
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.21076233183856502,
"acc_stderr": 0.027373095500540193,
"acc_norm": 0.21076233183856502,
"acc_norm_stderr": 0.027373095500540193
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.033220157957767414,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.033220157957767414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.1875,
"acc_stderr": 0.0370468111477387,
"acc_norm": 0.1875,
"acc_norm_stderr": 0.0370468111477387
},
"harness|hendrycksTest-management|5": {
"acc": 0.22330097087378642,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.22330097087378642,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2835249042145594,
"acc_stderr": 0.01611731816683228,
"acc_norm": 0.2835249042145594,
"acc_norm_stderr": 0.01611731816683228
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961438,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961438
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351298,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351298
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24691358024691357,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.24691358024691357,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.02601199293090201,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.02601199293090201
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24511082138200782,
"acc_stderr": 0.010986307870045517,
"acc_norm": 0.24511082138200782,
"acc_norm_stderr": 0.010986307870045517
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.029896163033125478,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.029896163033125478
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.017630827375148383,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.017630827375148383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884603,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884603
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.23673469387755103,
"acc_stderr": 0.02721283588407316,
"acc_norm": 0.23673469387755103,
"acc_norm_stderr": 0.02721283588407316
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25301204819277107,
"acc_stderr": 0.033844291552331346,
"acc_norm": 0.25301204819277107,
"acc_norm_stderr": 0.033844291552331346
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2741738066095471,
"mc1_stderr": 0.015616518497219376,
"mc2": 0.4726841055154348,
"mc2_stderr": 0.015727848850119193
},
"harness|winogrande|5": {
"acc": 0.5430149960536701,
"acc_stderr": 0.01400038676159829
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Felladrin__Llama-68M-Chat-v1 | [
"region:us"
] | 2024-01-14T17:27:02+00:00 | {"pretty_name": "Evaluation run of Felladrin/Llama-68M-Chat-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [Felladrin/Llama-68M-Chat-v1](https://huggingface.co/Felladrin/Llama-68M-Chat-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Felladrin__Llama-68M-Chat-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T17:25:12.605913](https://huggingface.co/datasets/open-llm-leaderboard/details_Felladrin__Llama-68M-Chat-v1/blob/main/results_2024-01-14T17-25-12.605913.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2518558528274769,\n \"acc_stderr\": 0.030387282193610175,\n \"acc_norm\": 0.25203959947439164,\n \"acc_norm_stderr\": 0.031196164528136557,\n \"mc1\": 0.2741738066095471,\n \"mc1_stderr\": 0.015616518497219376,\n \"mc2\": 0.4726841055154348,\n \"mc2_stderr\": 0.015727848850119193\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.1885665529010239,\n \"acc_stderr\": 0.011430897647675815,\n \"acc_norm\": 0.23293515358361774,\n \"acc_norm_stderr\": 0.012352507042617405\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.27693686516630156,\n \"acc_stderr\": 0.004465704810893541,\n \"acc_norm\": 0.28271260705038836,\n \"acc_norm_stderr\": 0.004493975527386726\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.03633384414073461,\n \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.03633384414073461\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827845,\n \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.2152777777777778,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.26,\n 
\"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.30057803468208094,\n \"acc_stderr\": 0.03496101481191181,\n \"acc_norm\": 0.30057803468208094,\n \"acc_norm_stderr\": 0.03496101481191181\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.1829787234042553,\n \"acc_stderr\": 0.025276041000449966,\n \"acc_norm\": 0.1829787234042553,\n \"acc_norm_stderr\": 0.025276041000449966\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n \"acc_stderr\": 0.03892431106518754,\n \"acc_norm\": 0.21929824561403508,\n \"acc_norm_stderr\": 0.03892431106518754\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.03375672449560554,\n \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.03375672449560554\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918417,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918417\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114468,\n \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114468\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3383838383838384,\n \"acc_stderr\": 0.03371124142626302,\n \"acc_norm\": 0.3383838383838384,\n \"acc_norm_stderr\": 0.03371124142626302\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.33678756476683935,\n \"acc_stderr\": 0.03410780251836184,\n \"acc_norm\": 0.33678756476683935,\n \"acc_norm_stderr\": 0.03410780251836184\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.34102564102564104,\n \"acc_stderr\": 0.02403548967633507,\n \"acc_norm\": 0.34102564102564104,\n \"acc_norm_stderr\": 0.02403548967633507\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3445378151260504,\n \"acc_stderr\": 0.030868682604121633,\n \"acc_norm\": 0.3445378151260504,\n \"acc_norm_stderr\": 0.030868682604121633\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23853211009174313,\n \"acc_stderr\": 0.01827257581023187,\n \"acc_norm\": 0.23853211009174313,\n \"acc_norm_stderr\": 0.01827257581023187\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501947,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501947\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21076233183856502,\n \"acc_stderr\": 0.027373095500540193,\n \"acc_norm\": 0.21076233183856502,\n \"acc_norm_stderr\": 0.027373095500540193\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.1875,\n \"acc_stderr\": 0.0370468111477387,\n \"acc_norm\": 0.1875,\n \"acc_norm_stderr\": 0.0370468111477387\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.2835249042145594,\n \"acc_stderr\": 0.01611731816683228,\n \"acc_norm\": 0.2835249042145594,\n \"acc_norm_stderr\": 0.01611731816683228\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n \"acc_stderr\": 0.014310999547961438,\n \"acc_norm\": 0.24134078212290502,\n \"acc_norm_stderr\": 0.014310999547961438\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351298,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351298\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.2990353697749196,\n \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.24691358024691357,\n \"acc_stderr\": 0.02399350170904211,\n \"acc_norm\": 0.24691358024691357,\n \"acc_norm_stderr\": 0.02399350170904211\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02601199293090201,\n \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02601199293090201\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24511082138200782,\n \"acc_stderr\": 0.010986307870045517,\n \"acc_norm\": 0.24511082138200782,\n \"acc_norm_stderr\": 0.010986307870045517\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.029896163033125478,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.029896163033125478\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.017630827375148383,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.017630827375148383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.03831305140884603,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.03831305140884603\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.23673469387755103,\n \"acc_stderr\": 0.02721283588407316,\n \"acc_norm\": 0.23673469387755103,\n \"acc_norm_stderr\": 0.02721283588407316\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25301204819277107,\n \"acc_stderr\": 0.033844291552331346,\n \"acc_norm\": 0.25301204819277107,\n \"acc_norm_stderr\": 0.033844291552331346\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2741738066095471,\n \"mc1_stderr\": 0.015616518497219376,\n \"mc2\": 0.4726841055154348,\n \"mc2_stderr\": 0.015727848850119193\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5430149960536701,\n \"acc_stderr\": 0.01400038676159829\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/Felladrin/Llama-68M-Chat-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|arc:challenge|25_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|gsm8k|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hellaswag|10_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T17-25-12.605913.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T17-25-12.605913.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T17-25-12.605913.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T17-25-12.605913.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T17-25-12.605913.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T17-25-12.605913.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["**/details_harness|winogrande|5_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T17-25-12.605913.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T17_25_12.605913", "path": ["results_2024-01-14T17-25-12.605913.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T17-25-12.605913.parquet"]}]}]} | 2024-01-14T17:27:23+00:00 |
be9393c81659b176efda35dfa5291d38920ff500 |
# Dataset Card for Evaluation run of AA051611/limb
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051611/limb](https://huggingface.co/AA051611/limb) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051611__limb",
"harness_winogrande_5",
split="train")
```
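For convenience, here is a minimal sketch (illustrative only, not part of the original evaluation tooling) of how you might enumerate the available configurations and load the aggregated "results" configuration. It assumes the "latest" split declared in this repository's YAML metadata points to the most recent run:
```python
from datasets import get_dataset_config_names, load_dataset

# List the available configurations (one per evaluated task, plus "results").
configs = get_dataset_config_names("open-llm-leaderboard/details_AA051611__limb")
print(f"{len(configs)} configurations found")

# Load the aggregated metrics; "latest" is the split name declared in the repo
# metadata and should point to the most recent evaluation run (assumption).
results = load_dataset("open-llm-leaderboard/details_AA051611__limb",
                       "results",
                       split="latest")
print(results[0])
```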
## Latest results
These are the [latest results from run 2024-01-14T17:31:13.154923](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__limb/blob/main/results_2024-01-14T17-31-13.154923.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7173948628205344,
"acc_stderr": 0.029795425890422344,
"acc_norm": 0.7228232912878558,
"acc_norm_stderr": 0.030359217292974663,
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496767,
"mc2": 0.5836669238966421,
"mc2_stderr": 0.01521191071011394
},
"harness|arc:challenge|25": {
"acc": 0.5921501706484642,
"acc_stderr": 0.014361097288449712,
"acc_norm": 0.6348122866894198,
"acc_norm_stderr": 0.0140702655192688
},
"harness|hellaswag|10": {
"acc": 0.6357299342760406,
"acc_stderr": 0.0048024139199326675,
"acc_norm": 0.8307110137422824,
"acc_norm_stderr": 0.0037424055874098806
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.04024778401977108,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.04024778401977108
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8157894736842105,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.8157894736842105,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7849056603773585,
"acc_stderr": 0.025288394502891363,
"acc_norm": 0.7849056603773585,
"acc_norm_stderr": 0.025288394502891363
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059007,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059007
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.0351494255126744,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.0351494255126744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287533,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287533
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.723404255319149,
"acc_stderr": 0.029241883869628813,
"acc_norm": 0.723404255319149,
"acc_norm_stderr": 0.029241883869628813
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7034482758620689,
"acc_stderr": 0.03806142687309992,
"acc_norm": 0.7034482758620689,
"acc_norm_stderr": 0.03806142687309992
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.626984126984127,
"acc_stderr": 0.02490699045899257,
"acc_norm": 0.626984126984127,
"acc_norm_stderr": 0.02490699045899257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8451612903225807,
"acc_stderr": 0.020579287326583227,
"acc_norm": 0.8451612903225807,
"acc_norm_stderr": 0.020579287326583227
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.034991131376767445,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.034991131376767445
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9141414141414141,
"acc_stderr": 0.019960225563172885,
"acc_norm": 0.9141414141414141,
"acc_norm_stderr": 0.019960225563172885
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.018088393839078898,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.018088393839078898
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.764102564102564,
"acc_stderr": 0.021525965407408726,
"acc_norm": 0.764102564102564,
"acc_norm_stderr": 0.021525965407408726
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4111111111111111,
"acc_stderr": 0.029999923508706682,
"acc_norm": 0.4111111111111111,
"acc_norm_stderr": 0.029999923508706682
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7899159663865546,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.7899159663865546,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.45695364238410596,
"acc_stderr": 0.04067325174247443,
"acc_norm": 0.45695364238410596,
"acc_norm_stderr": 0.04067325174247443
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8880733944954129,
"acc_stderr": 0.013517352714958792,
"acc_norm": 0.8880733944954129,
"acc_norm_stderr": 0.013517352714958792
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6342592592592593,
"acc_stderr": 0.03284738857647206,
"acc_norm": 0.6342592592592593,
"acc_norm_stderr": 0.03284738857647206
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8921568627450981,
"acc_stderr": 0.021770522281368394,
"acc_norm": 0.8921568627450981,
"acc_norm_stderr": 0.021770522281368394
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.02101105265987846,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.02101105265987846
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7399103139013453,
"acc_stderr": 0.029442495585857476,
"acc_norm": 0.7399103139013453,
"acc_norm_stderr": 0.029442495585857476
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.032178294207446306,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.032178294207446306
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.0291998024556228,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.0291998024556228
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517963,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517963
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.03083349114628123,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.03083349114628123
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719098,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719098
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808628,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808628
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9188034188034188,
"acc_stderr": 0.017893784904018543,
"acc_norm": 0.9188034188034188,
"acc_norm_stderr": 0.017893784904018543
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263714,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263714
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.896551724137931,
"acc_stderr": 0.0108904525446915,
"acc_norm": 0.896551724137931,
"acc_norm_stderr": 0.0108904525446915
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.02326752843210017,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.02326752843210017
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42569832402234636,
"acc_stderr": 0.016536829648997112,
"acc_norm": 0.42569832402234636,
"acc_norm_stderr": 0.016536829648997112
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.022140767512880973,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.022140767512880973
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7813504823151125,
"acc_stderr": 0.02347558141786111,
"acc_norm": 0.7813504823151125,
"acc_norm_stderr": 0.02347558141786111
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8117283950617284,
"acc_stderr": 0.02175186606081587,
"acc_norm": 0.8117283950617284,
"acc_norm_stderr": 0.02175186606081587
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5283687943262412,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.5283687943262412,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.529986962190352,
"acc_stderr": 0.012747248967079058,
"acc_norm": 0.529986962190352,
"acc_norm_stderr": 0.012747248967079058
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.025767252010855946,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.025767252010855946
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7630718954248366,
"acc_stderr": 0.01720166216978978,
"acc_norm": 0.7630718954248366,
"acc_norm_stderr": 0.01720166216978978
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7836734693877551,
"acc_stderr": 0.026358916334904028,
"acc_norm": 0.7836734693877551,
"acc_norm_stderr": 0.026358916334904028
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018526,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018526
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.95,
"acc_stderr": 0.021904291355759033,
"acc_norm": 0.95,
"acc_norm_stderr": 0.021904291355759033
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496767,
"mc2": 0.5836669238966421,
"mc2_stderr": 0.01521191071011394
},
"harness|winogrande|5": {
"acc": 0.797947908445146,
"acc_stderr": 0.01128501375404745
},
"harness|gsm8k|5": {
"acc": 0.55420773313116,
"acc_stderr": 0.013691305174506698
}
}
```
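If you prefer to work with the raw aggregate file rather than the "results" configuration, a minimal sketch along the following lines should work. It assumes the JSON file linked above sits at the repository root and that the per-task metrics may be nested under a top-level "results" key:
```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results file for this run (linked above) from the dataset repo.
results_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_AA051611__limb",
    repo_type="dataset",
    filename="results_2024-01-14T17-31-13.154923.json",
)

with open(results_path) as f:
    raw = json.load(f)

# The per-task metrics may be nested under a "results" key (assumption);
# fall back to the top level otherwise.
metrics = raw.get("results", raw)
all_scores = metrics["all"]
print(f"Average acc: {all_scores['acc']:.4f} ± {all_scores['acc_stderr']:.4f}")
```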
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AA051611__limb | [
"region:us"
] | 2024-01-14T17:33:21+00:00 | {"pretty_name": "Evaluation run of AA051611/limb", "dataset_summary": "Dataset automatically created during the evaluation run of model [AA051611/limb](https://huggingface.co/AA051611/limb) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051611__limb\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T17:31:13.154923](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__limb/blob/main/results_2024-01-14T17-31-13.154923.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7173948628205344,\n \"acc_stderr\": 0.029795425890422344,\n \"acc_norm\": 0.7228232912878558,\n \"acc_norm_stderr\": 0.030359217292974663,\n \"mc1\": 0.3990208078335373,\n \"mc1_stderr\": 0.017142825728496767,\n \"mc2\": 0.5836669238966421,\n \"mc2_stderr\": 0.01521191071011394\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5921501706484642,\n \"acc_stderr\": 0.014361097288449712,\n \"acc_norm\": 0.6348122866894198,\n \"acc_norm_stderr\": 0.0140702655192688\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6357299342760406,\n \"acc_stderr\": 0.0048024139199326675,\n \"acc_norm\": 0.8307110137422824,\n \"acc_norm_stderr\": 0.0037424055874098806\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n \"acc_stderr\": 0.04024778401977108,\n \"acc_norm\": 0.6814814814814815,\n \"acc_norm_stderr\": 0.04024778401977108\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7849056603773585,\n \"acc_stderr\": 0.025288394502891363,\n \"acc_norm\": 0.7849056603773585,\n \"acc_norm_stderr\": 0.025288394502891363\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.03309615177059007,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.03309615177059007\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 
0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.0351494255126744,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.0351494255126744\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287533,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287533\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.723404255319149,\n \"acc_stderr\": 0.029241883869628813,\n \"acc_norm\": 0.723404255319149,\n \"acc_norm_stderr\": 0.029241883869628813\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7034482758620689,\n \"acc_stderr\": 0.03806142687309992,\n \"acc_norm\": 0.7034482758620689,\n \"acc_norm_stderr\": 0.03806142687309992\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.626984126984127,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\": 0.626984126984127,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8451612903225807,\n \"acc_stderr\": 0.020579287326583227,\n \"acc_norm\": 0.8451612903225807,\n \"acc_norm_stderr\": 0.020579287326583227\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.034991131376767445,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.034991131376767445\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9141414141414141,\n \"acc_stderr\": 0.019960225563172885,\n \"acc_norm\": 0.9141414141414141,\n \"acc_norm_stderr\": 0.019960225563172885\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078898,\n \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078898\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.764102564102564,\n \"acc_stderr\": 0.021525965407408726,\n \"acc_norm\": 
0.764102564102564,\n \"acc_norm_stderr\": 0.021525965407408726\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4111111111111111,\n \"acc_stderr\": 0.029999923508706682,\n \"acc_norm\": 0.4111111111111111,\n \"acc_norm_stderr\": 0.029999923508706682\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7899159663865546,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.7899159663865546,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.45695364238410596,\n \"acc_stderr\": 0.04067325174247443,\n \"acc_norm\": 0.45695364238410596,\n \"acc_norm_stderr\": 0.04067325174247443\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8880733944954129,\n \"acc_stderr\": 0.013517352714958792,\n \"acc_norm\": 0.8880733944954129,\n \"acc_norm_stderr\": 0.013517352714958792\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6342592592592593,\n \"acc_stderr\": 0.03284738857647206,\n \"acc_norm\": 0.6342592592592593,\n \"acc_norm_stderr\": 0.03284738857647206\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8921568627450981,\n \"acc_stderr\": 0.021770522281368394,\n \"acc_norm\": 0.8921568627450981,\n \"acc_norm_stderr\": 0.021770522281368394\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8818565400843882,\n \"acc_stderr\": 0.02101105265987846,\n \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.02101105265987846\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7399103139013453,\n \"acc_stderr\": 0.029442495585857476,\n \"acc_norm\": 0.7399103139013453,\n \"acc_norm_stderr\": 0.029442495585857476\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.032178294207446306,\n \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.032178294207446306\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8842975206611571,\n \"acc_stderr\": 0.0291998024556228,\n \"acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.0291998024556228\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.03520703990517963,\n \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.03520703990517963\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.03083349114628123,\n \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.03083349114628123\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808628,\n \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808628\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n \"acc_stderr\": 0.017893784904018543,\n \"acc_norm\": 0.9188034188034188,\n \"acc_norm_stderr\": 0.017893784904018543\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263714,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263714\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.896551724137931,\n \"acc_stderr\": 0.0108904525446915,\n \"acc_norm\": 0.896551724137931,\n \"acc_norm_stderr\": 0.0108904525446915\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.02326752843210017,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.02326752843210017\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42569832402234636,\n \"acc_stderr\": 0.016536829648997112,\n \"acc_norm\": 0.42569832402234636,\n \"acc_norm_stderr\": 0.016536829648997112\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.022140767512880973,\n \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.022140767512880973\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7813504823151125,\n \"acc_stderr\": 0.02347558141786111,\n \"acc_norm\": 0.7813504823151125,\n \"acc_norm_stderr\": 0.02347558141786111\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8117283950617284,\n \"acc_stderr\": 0.02175186606081587,\n \"acc_norm\": 0.8117283950617284,\n \"acc_norm_stderr\": 0.02175186606081587\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5283687943262412,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.5283687943262412,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.529986962190352,\n \"acc_stderr\": 0.012747248967079058,\n \"acc_norm\": 0.529986962190352,\n \"acc_norm_stderr\": 0.012747248967079058\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.025767252010855946,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.025767252010855946\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7630718954248366,\n \"acc_stderr\": 0.01720166216978978,\n \"acc_norm\": 0.7630718954248366,\n \"acc_norm_stderr\": 0.01720166216978978\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7836734693877551,\n \"acc_stderr\": 0.026358916334904028,\n \"acc_norm\": 0.7836734693877551,\n \"acc_norm_stderr\": 0.026358916334904028\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n \"acc_stderr\": 0.023729830881018526,\n \"acc_norm\": 0.8706467661691543,\n \"acc_norm_stderr\": 0.023729830881018526\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.95,\n \"acc_stderr\": 0.021904291355759033,\n \"acc_norm\": 0.95,\n \"acc_norm_stderr\": 0.021904291355759033\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3990208078335373,\n \"mc1_stderr\": 0.017142825728496767,\n \"mc2\": 0.5836669238966421,\n \"mc2_stderr\": 0.01521191071011394\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.797947908445146,\n \"acc_stderr\": 0.01128501375404745\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.55420773313116,\n \"acc_stderr\": 0.013691305174506698\n }\n}\n```", "repo_url": "https://huggingface.co/AA051611/limb", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|arc:challenge|25_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|gsm8k|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hellaswag|10_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T17-31-13.154923.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T17-31-13.154923.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T17-31-13.154923.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T17-31-13.154923.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T17-31-13.154923.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["**/details_harness|winogrande|5_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T17-31-13.154923.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T17_31_13.154923", "path": ["results_2024-01-14T17-31-13.154923.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T17-31-13.154923.parquet"]}]}]} | 2024-01-14T17:33:43+00:00 |
c1f99028e9de05cfae47d6ae5865ebd7b2c742c0 | rjds0207/pabloalboran | [
"region:us"
] | 2024-01-14T17:36:48+00:00 | {} | 2024-01-14T17:37:47+00:00 |
|
460e7a581b5b8d02abc21e39d09c0cb8103c837b | # Promoter Sequences and Corresponding Gene Expression data for Maize NAM lines
This dataset contains the promoter sequences and the corresponding gene expression data, as TPM values, for **26 Maize NAM lines**, and has been used for the finetuning step *(for the downstream task of gene expression prediction)* of [`Florabert`](https://huggingface.co/Gurveer05/FloraBERT).
The data has been split into train, test and eval sets (70-20-10 split). In all, there are ~700,000 entries across the files. The steps followed to obtain this data are available in this [`Github Repository`](https://github.com/gurveervirk/florabert).
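As a quick orientation, a minimal loading sketch is shown below. The split name `train` and the record layout (one promoter sequence paired with a list of per-tissue TPM labels) are assumptions based on the description in this card, not details confirmed from the repository files:
```python
from datasets import load_dataset

# Load the train split of the promoter/expression dataset
# (split and column names are assumptions; adjust them if the repo files differ).
dataset = load_dataset("Gurveer05/maize-nam-gene-expression-data", split="train")

# Inspect one record: it should pair a promoter sequence with its TPM labels.
example = dataset[0]
print(example.keys())
print(example)
```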
The labels correspond to the TPM values for the various tissues, in the order: `tassel`, `base`, `anther`, `middle`, `ear`, `shoot`, `tip`, `root`. The sequences used are the promoter sequences of Maize NAM line genes that have a TPM value greater than 1 in at least one tissue. | Gurveer05/maize-nam-gene-expression-data | [
"size_categories:100K<n<1M",
"biology",
"DNA",
"Gene Expression",
"region:us"
] | 2024-01-14T17:37:20+00:00 | {"size_categories": ["100K<n<1M"], "tags": ["biology", "DNA", "Gene Expression"]} | 2024-01-14T18:19:05+00:00 |
2288120a7c13ff4ab40cf6de8e5a2e237c723f3d |
# Dataset Card for Evaluation run of huangyt/Mistral-7B-v0.1-Open-Platypus_2.5w-r16-gate_up_down
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [huangyt/Mistral-7B-v0.1-Open-Platypus_2.5w-r16-gate_up_down](https://huggingface.co/huangyt/Mistral-7B-v0.1-Open-Platypus_2.5w-r16-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_huangyt__Mistral-7B-v0.1-Open-Platypus_2.5w-r16-gate_up_down",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-14T17:36:45.221009](https://huggingface.co/datasets/open-llm-leaderboard/details_huangyt__Mistral-7B-v0.1-Open-Platypus_2.5w-r16-gate_up_down/blob/main/results_2024-01-14T17-36-45.221009.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6355178040599482,
"acc_stderr": 0.03241610229663876,
"acc_norm": 0.641571442422577,
"acc_norm_stderr": 0.033065020971592085,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882452,
"mc2": 0.45435317672164416,
"mc2_stderr": 0.014528686611193308
},
"harness|arc:challenge|25": {
"acc": 0.5665529010238908,
"acc_stderr": 0.014481376224558902,
"acc_norm": 0.6126279863481229,
"acc_norm_stderr": 0.014235872487909872
},
"harness|hellaswag|10": {
"acc": 0.6271659032065325,
"acc_stderr": 0.004825702533920412,
"acc_norm": 0.8319059948217487,
"acc_norm_stderr": 0.0037318549570309373
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.02872750295788027,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.02872750295788027
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474884,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474884
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.03008862949021749,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.03008862949021749
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.024503472557110936,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.024503472557110936
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465076,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465076
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976044,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976044
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.02886743144984932,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.02886743144984932
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601453,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601453
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464085,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464085
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38212290502793295,
"acc_stderr": 0.016251139711570762,
"acc_norm": 0.38212290502793295,
"acc_norm_stderr": 0.016251139711570762
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.024404394928087873,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.024404394928087873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083143,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083143
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495155,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495155
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454132,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454132
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710905,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710905
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.02954774168764004,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.02954774168764004
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882452,
"mc2": 0.45435317672164416,
"mc2_stderr": 0.014528686611193308
},
"harness|winogrande|5": {
"acc": 0.7734806629834254,
"acc_stderr": 0.011764149054698332
},
"harness|gsm8k|5": {
"acc": 0.3912054586808188,
"acc_stderr": 0.0134425024027943
}
}
```
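For working with these numbers programmatically, a minimal sketch is shown below; it assumes the aggregated `results` configuration and its `latest` split load as listed in this card's configuration metadata, which is how these leaderboard detail repos are usually laid out:
```python
from datasets import load_dataset

# Load the aggregated results configuration; the "latest" split should point to
# the most recent evaluation run (the same numbers shown in the JSON above).
results = load_dataset(
    "open-llm-leaderboard/details_huangyt__Mistral-7B-v0.1-Open-Platypus_2.5w-r16-gate_up_down",
    "results",
    split="latest",
)

print(results[0])
```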
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_huangyt__Mistral-7B-v0.1-Open-Platypus_2.5w-r16-gate_up_down | [
"region:us"
] | 2024-01-14T17:39:04+00:00 | {"pretty_name": "Evaluation run of huangyt/Mistral-7B-v0.1-Open-Platypus_2.5w-r16-gate_up_down", "dataset_summary": "Dataset automatically created during the evaluation run of model [huangyt/Mistral-7B-v0.1-Open-Platypus_2.5w-r16-gate_up_down](https://huggingface.co/huangyt/Mistral-7B-v0.1-Open-Platypus_2.5w-r16-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_huangyt__Mistral-7B-v0.1-Open-Platypus_2.5w-r16-gate_up_down\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T17:36:45.221009](https://huggingface.co/datasets/open-llm-leaderboard/details_huangyt__Mistral-7B-v0.1-Open-Platypus_2.5w-r16-gate_up_down/blob/main/results_2024-01-14T17-36-45.221009.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6355178040599482,\n \"acc_stderr\": 0.03241610229663876,\n \"acc_norm\": 0.641571442422577,\n \"acc_norm_stderr\": 0.033065020971592085,\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882452,\n \"mc2\": 0.45435317672164416,\n \"mc2_stderr\": 0.014528686611193308\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5665529010238908,\n \"acc_stderr\": 0.014481376224558902,\n \"acc_norm\": 0.6126279863481229,\n \"acc_norm_stderr\": 0.014235872487909872\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6271659032065325,\n \"acc_stderr\": 0.004825702533920412,\n \"acc_norm\": 0.8319059948217487,\n \"acc_norm_stderr\": 0.0037318549570309373\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n 
\"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474884,\n \"acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474884\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.7451612903225806,\n \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.035145285621750094,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.035145285621750094\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.03008862949021749,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.03008862949021749\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n \"acc_norm\": 0.8601036269430051,\n 
\"acc_norm_stderr\": 0.025033870583015184\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110936,\n \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110936\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465076,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465076\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976044,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976044\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.02886743144984932,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.02886743144984932\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601453,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601453\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n \"acc_stderr\": 0.013778693778464085,\n \"acc_norm\": 0.8186462324393359,\n \"acc_norm_stderr\": 0.013778693778464085\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38212290502793295,\n \"acc_stderr\": 0.016251139711570762,\n \"acc_norm\": 0.38212290502793295,\n \"acc_norm_stderr\": 0.016251139711570762\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.024404394928087873,\n \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.024404394928087873\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n \"acc_stderr\": 0.012745204626083143,\n \"acc_norm\": 0.46870925684485004,\n \"acc_norm_stderr\": 0.012745204626083143\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495155,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495155\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484378,\n \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484378\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710905,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710905\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.02954774168764004,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.02954774168764004\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882452,\n \"mc2\": 0.45435317672164416,\n \"mc2_stderr\": 0.014528686611193308\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698332\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.3912054586808188,\n \"acc_stderr\": 0.0134425024027943\n }\n}\n```", "repo_url": "https://huggingface.co/huangyt/Mistral-7B-v0.1-Open-Platypus_2.5w-r16-gate_up_down", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|arc:challenge|25_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|gsm8k|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hellaswag|10_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T17-36-45.221009.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T17-36-45.221009.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T17-36-45.221009.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T17-36-45.221009.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T17-36-45.221009.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["**/details_harness|winogrande|5_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-14T17-36-45.221009.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T17_36_45.221009", "path": ["results_2024-01-14T17-36-45.221009.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T17-36-45.221009.parquet"]}]}]} | 2024-01-14T17:39:25+00:00 |
9c948bc3d783828325b229931cdb5515e49b147c | jmaczan/TORGO-very-small | [
"task_categories:automatic-speech-recognition",
"size_categories:n<1K",
"language:en",
"license:other",
"dysarthria",
"region:us"
] | 2024-01-14T17:46:05+00:00 | {"language": ["en"], "license": "other", "size_categories": ["n<1K"], "task_categories": ["automatic-speech-recognition"], "pretty_name": "TORGO very small", "license_name": "torgo-dataset-license", "license_link": "https://www.cs.toronto.edu/~complingweb/data/TORGO/torgo.html", "tags": ["dysarthria"]} | 2024-01-16T19:40:04+00:00 |
|
20d4a4c3794e790a0284d5351568452194cd0dee | Abhinav-B/finetune_llama_wikisql | [
"region:us"
] | 2024-01-14T17:47:16+00:00 | {"dataset_info": {"features": [{"name": "formatted_text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1530908, "num_examples": 10000}], "download_size": 703398, "dataset_size": 1530908}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-15T22:22:53+00:00 |
|
784ffbf7862cb01c0639299db556d1bc107c87ce | guanxiongsun/got10k | [
"region:us"
] | 2024-01-14T17:50:56+00:00 | {} | 2024-01-14T17:50:56+00:00 |
|
91447063a2898086efb844ba9aa08c025dda13fe | epinnock/software-architecture-instructions-preference | [
"region:us"
] | 2024-01-14T17:52:57+00:00 | {"dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "generation_model", "sequence": "string"}, {"name": "generation_prompt", "list": {"list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}}, {"name": "raw_generation_responses", "sequence": "string"}, {"name": "generations", "sequence": "string"}, {"name": "labelling_model", "dtype": "string"}, {"name": "labelling_prompt", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "raw_labelling_response", "dtype": "string"}, {"name": "rating", "sequence": "float64"}, {"name": "rationale", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 1356546, "num_examples": 50}], "download_size": 558838, "dataset_size": 1356546}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T17:52:58+00:00 |
|
48aa3e48df21521082ae8033b5d1124d34eec6c4 | pedromigurasdev/llama_2_jose_antorcha | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T17:54:14+00:00 | {"license": "apache-2.0"} | 2024-01-14T17:54:35+00:00 |
|
18adf4ae3967299e56c115831cdd45d0edb0c2dc | ayoubkirouane/Orca-Direct-Preference-Optimization | [
"region:us"
] | 2024-01-14T17:59:37+00:00 | {"dataset_info": {"features": [{"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}, {"name": "prompt", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 35914686, "num_examples": 12859}], "download_size": 19539812, "dataset_size": 35914686}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T17:59:50+00:00 |
|
de6dde6e078864329bdff2c1903f3145ffa0f13e | jilp00/youtoks-transcripts-Stanford-CS25-Transformers-United | [
"region:us"
] | 2024-01-14T18:02:36+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1159306, "num_examples": 1390}], "download_size": 619585, "dataset_size": 1159306}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T18:02:38+00:00 |
|
eeb114a404f3a53cea7d07926cc1bd6f0eef7668 | jilp00/youtoks-transformers | [
"region:us"
] | 2024-01-14T18:03:32+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "token_count", "dtype": "int64"}, {"name": "response", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2092099, "num_examples": 1390}], "download_size": 1025873, "dataset_size": 2092099}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T18:03:35+00:00 |
|
85c6d3e1ba23b2fe67b062ca7cc0c6ed8ae6666c |
# Dataset Card for Evaluation run of one-man-army/UNA-34Beagles-32K-bf16-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [one-man-army/UNA-34Beagles-32K-bf16-v1](https://huggingface.co/one-man-army/UNA-34Beagles-32K-bf16-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_one-man-army__UNA-34Beagles-32K-bf16-v1",
"harness_winogrande_5",
split="train")
```
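If you are not sure which configuration name to pass, the full list (one configuration per evaluated task, plus the aggregated "results" configuration) can be retrieved programmatically. A minimal sketch, assuming the `datasets` library is installed and the repository is publicly accessible:
```python
from datasets import get_dataset_config_names

# List every available configuration of this details repository
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_one-man-army__UNA-34Beagles-32K-bf16-v1"
)
print(len(configs), "configurations available")
print(configs[:5])  # a few of the per-task configurations
```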
## Latest results
These are the [latest results from run 2024-01-14T18:01:24.840782](https://huggingface.co/datasets/open-llm-leaderboard/details_one-man-army__UNA-34Beagles-32K-bf16-v1/blob/main/results_2024-01-14T18-01-24.840782.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7603825099190668,
"acc_stderr": 0.028403734149400593,
"acc_norm": 0.7656218376316938,
"acc_norm_stderr": 0.02893068310994367,
"mc1": 0.5887392900856793,
"mc1_stderr": 0.01722562708366087,
"mc2": 0.7354905615781797,
"mc2_stderr": 0.014104277111112697
},
"harness|arc:challenge|25": {
"acc": 0.7047781569965871,
"acc_stderr": 0.01332975029338232,
"acc_norm": 0.735494880546075,
"acc_norm_stderr": 0.012889272949313368
},
"harness|hellaswag|10": {
"acc": 0.6716789484166501,
"acc_stderr": 0.004686425851253278,
"acc_norm": 0.85929097789285,
"acc_norm_stderr": 0.00347010499020439
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7481481481481481,
"acc_stderr": 0.03749850709174021,
"acc_norm": 0.7481481481481481,
"acc_norm_stderr": 0.03749850709174021
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8552631578947368,
"acc_stderr": 0.028631951845930384,
"acc_norm": 0.8552631578947368,
"acc_norm_stderr": 0.028631951845930384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8150943396226416,
"acc_stderr": 0.023893351834464317,
"acc_norm": 0.8150943396226416,
"acc_norm_stderr": 0.023893351834464317
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8680555555555556,
"acc_stderr": 0.02830096838204443,
"acc_norm": 0.8680555555555556,
"acc_norm_stderr": 0.02830096838204443
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.03345036916788992,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.03345036916788992
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7914893617021277,
"acc_stderr": 0.02655698211783874,
"acc_norm": 0.7914893617021277,
"acc_norm_stderr": 0.02655698211783874
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7448275862068966,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.7448275862068966,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.02326651221373058,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.02326651221373058
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5873015873015873,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.5873015873015873,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9096774193548387,
"acc_stderr": 0.016306570644488323,
"acc_norm": 0.9096774193548387,
"acc_norm_stderr": 0.016306570644488323
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6305418719211823,
"acc_stderr": 0.033959703819985726,
"acc_norm": 0.6305418719211823,
"acc_norm_stderr": 0.033959703819985726
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706463,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706463
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9191919191919192,
"acc_stderr": 0.019417681889724536,
"acc_norm": 0.9191919191919192,
"acc_norm_stderr": 0.019417681889724536
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9585492227979274,
"acc_stderr": 0.01438543285747644,
"acc_norm": 0.9585492227979274,
"acc_norm_stderr": 0.01438543285747644
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8179487179487179,
"acc_stderr": 0.019565236782930893,
"acc_norm": 0.8179487179487179,
"acc_norm_stderr": 0.019565236782930893
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.030343862998512623,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.030343862998512623
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8445378151260504,
"acc_stderr": 0.023536818625398897,
"acc_norm": 0.8445378151260504,
"acc_norm_stderr": 0.023536818625398897
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4768211920529801,
"acc_stderr": 0.04078093859163083,
"acc_norm": 0.4768211920529801,
"acc_norm_stderr": 0.04078093859163083
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9211009174311927,
"acc_stderr": 0.0115581981137696,
"acc_norm": 0.9211009174311927,
"acc_norm_stderr": 0.0115581981137696
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.03179876342176851,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.03179876342176851
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316945,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316945
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.01888975055095671,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.01888975055095671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7847533632286996,
"acc_stderr": 0.02758406660220827,
"acc_norm": 0.7847533632286996,
"acc_norm_stderr": 0.02758406660220827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.027285246312758957,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.027285246312758957
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.029239272675632748,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.029239272675632748
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8650306748466258,
"acc_stderr": 0.026845765054553838,
"acc_norm": 0.8650306748466258,
"acc_norm_stderr": 0.026845765054553838
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.015006312806446912,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.015006312806446912
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9042145593869731,
"acc_stderr": 0.010524031079055831,
"acc_norm": 0.9042145593869731,
"acc_norm_stderr": 0.010524031079055831
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8121387283236994,
"acc_stderr": 0.021029269752423203,
"acc_norm": 0.8121387283236994,
"acc_norm_stderr": 0.021029269752423203
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7854748603351955,
"acc_stderr": 0.013728923407828855,
"acc_norm": 0.7854748603351955,
"acc_norm_stderr": 0.013728923407828855
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8496732026143791,
"acc_stderr": 0.020464175124332625,
"acc_norm": 0.8496732026143791,
"acc_norm_stderr": 0.020464175124332625
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8102893890675241,
"acc_stderr": 0.022268196258783228,
"acc_norm": 0.8102893890675241,
"acc_norm_stderr": 0.022268196258783228
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8672839506172839,
"acc_stderr": 0.018877353839571842,
"acc_norm": 0.8672839506172839,
"acc_norm_stderr": 0.018877353839571842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6276595744680851,
"acc_stderr": 0.02883892147125145,
"acc_norm": 0.6276595744680851,
"acc_norm_stderr": 0.02883892147125145
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5788787483702738,
"acc_stderr": 0.012610325733489905,
"acc_norm": 0.5788787483702738,
"acc_norm_stderr": 0.012610325733489905
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8198529411764706,
"acc_stderr": 0.02334516361654485,
"acc_norm": 0.8198529411764706,
"acc_norm_stderr": 0.02334516361654485
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.815359477124183,
"acc_stderr": 0.015697029240757776,
"acc_norm": 0.815359477124183,
"acc_norm_stderr": 0.015697029240757776
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8163265306122449,
"acc_stderr": 0.02478907133200765,
"acc_norm": 0.8163265306122449,
"acc_norm_stderr": 0.02478907133200765
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8947368421052632,
"acc_stderr": 0.023537557657892567,
"acc_norm": 0.8947368421052632,
"acc_norm_stderr": 0.023537557657892567
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5887392900856793,
"mc1_stderr": 0.01722562708366087,
"mc2": 0.7354905615781797,
"mc2_stderr": 0.014104277111112697
},
"harness|winogrande|5": {
"acc": 0.829518547750592,
"acc_stderr": 0.0105690211228259
},
"harness|gsm8k|5": {
"acc": 0.6004548900682335,
"acc_stderr": 0.013491660298815985
}
}
```
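Rather than copying numbers out of the JSON above, the aggregated metrics can also be loaded directly from the "results" configuration, whose "latest" split points to the most recent run. A minimal sketch; the exact column layout of the results parquet is an assumption, so inspect the loaded table before relying on specific field names:
```python
from datasets import load_dataset

# Load the aggregated results of the most recent evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_one-man-army__UNA-34Beagles-32K-bf16-v1",
    "results",
    split="latest",
)

# Inspect the schema before assuming specific field names
print(results)
print(results[0])
```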
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_one-man-army__UNA-34Beagles-32K-bf16-v1 | [
"region:us"
] | 2024-01-14T18:03:41+00:00 | {"pretty_name": "Evaluation run of one-man-army/UNA-34Beagles-32K-bf16-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [one-man-army/UNA-34Beagles-32K-bf16-v1](https://huggingface.co/one-man-army/UNA-34Beagles-32K-bf16-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_one-man-army__UNA-34Beagles-32K-bf16-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T18:01:24.840782](https://huggingface.co/datasets/open-llm-leaderboard/details_one-man-army__UNA-34Beagles-32K-bf16-v1/blob/main/results_2024-01-14T18-01-24.840782.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7603825099190668,\n \"acc_stderr\": 0.028403734149400593,\n \"acc_norm\": 0.7656218376316938,\n \"acc_norm_stderr\": 0.02893068310994367,\n \"mc1\": 0.5887392900856793,\n \"mc1_stderr\": 0.01722562708366087,\n \"mc2\": 0.7354905615781797,\n \"mc2_stderr\": 0.014104277111112697\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7047781569965871,\n \"acc_stderr\": 0.01332975029338232,\n \"acc_norm\": 0.735494880546075,\n \"acc_norm_stderr\": 0.012889272949313368\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6716789484166501,\n \"acc_stderr\": 0.004686425851253278,\n \"acc_norm\": 0.85929097789285,\n \"acc_norm_stderr\": 0.00347010499020439\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7481481481481481,\n \"acc_stderr\": 0.03749850709174021,\n \"acc_norm\": 0.7481481481481481,\n \"acc_norm_stderr\": 0.03749850709174021\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8552631578947368,\n \"acc_stderr\": 0.028631951845930384,\n \"acc_norm\": 0.8552631578947368,\n \"acc_norm_stderr\": 0.028631951845930384\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8150943396226416,\n \"acc_stderr\": 0.023893351834464317,\n \"acc_norm\": 0.8150943396226416,\n \"acc_norm_stderr\": 0.023893351834464317\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8680555555555556,\n \"acc_stderr\": 0.02830096838204443,\n \"acc_norm\": 0.8680555555555556,\n \"acc_norm_stderr\": 0.02830096838204443\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.03345036916788992,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.03345036916788992\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7914893617021277,\n \"acc_stderr\": 0.02655698211783874,\n \"acc_norm\": 0.7914893617021277,\n \"acc_norm_stderr\": 0.02655698211783874\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.5877192982456141,\n \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7448275862068966,\n \"acc_stderr\": 0.03632984052707842,\n \"acc_norm\": 0.7448275862068966,\n \"acc_norm_stderr\": 0.03632984052707842\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.02326651221373058,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.02326651221373058\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5873015873015873,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.5873015873015873,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9096774193548387,\n \"acc_stderr\": 0.016306570644488323,\n \"acc_norm\": 0.9096774193548387,\n \"acc_norm_stderr\": 0.016306570644488323\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6305418719211823,\n \"acc_stderr\": 0.033959703819985726,\n \"acc_norm\": 0.6305418719211823,\n \"acc_norm_stderr\": 0.033959703819985726\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706463,\n \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706463\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9191919191919192,\n \"acc_stderr\": 0.019417681889724536,\n \"acc_norm\": 0.9191919191919192,\n \"acc_norm_stderr\": 0.019417681889724536\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9585492227979274,\n \"acc_stderr\": 0.01438543285747644,\n \"acc_norm\": 0.9585492227979274,\n 
\"acc_norm_stderr\": 0.01438543285747644\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8179487179487179,\n \"acc_stderr\": 0.019565236782930893,\n \"acc_norm\": 0.8179487179487179,\n \"acc_norm_stderr\": 0.019565236782930893\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.45185185185185184,\n \"acc_stderr\": 0.030343862998512623,\n \"acc_norm\": 0.45185185185185184,\n \"acc_norm_stderr\": 0.030343862998512623\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8445378151260504,\n \"acc_stderr\": 0.023536818625398897,\n \"acc_norm\": 0.8445378151260504,\n \"acc_norm_stderr\": 0.023536818625398897\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9211009174311927,\n \"acc_stderr\": 0.0115581981137696,\n \"acc_norm\": 0.9211009174311927,\n \"acc_norm_stderr\": 0.0115581981137696\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.03179876342176851,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.03179876342176851\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316945,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316945\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9071729957805907,\n \"acc_stderr\": 0.01888975055095671,\n \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.01888975055095671\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7847533632286996,\n \"acc_stderr\": 0.02758406660220827,\n \"acc_norm\": 0.7847533632286996,\n \"acc_norm_stderr\": 0.02758406660220827\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9008264462809917,\n \"acc_stderr\": 0.027285246312758957,\n \"acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.027285246312758957\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n \"acc_stderr\": 0.029239272675632748,\n \"acc_norm\": 0.8981481481481481,\n \"acc_norm_stderr\": 0.029239272675632748\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.026845765054553838,\n \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.026845765054553838\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n \"acc_stderr\": 0.015006312806446912,\n \"acc_norm\": 0.9444444444444444,\n \"acc_norm_stderr\": 0.015006312806446912\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9042145593869731,\n \"acc_stderr\": 0.010524031079055831,\n \"acc_norm\": 0.9042145593869731,\n \"acc_norm_stderr\": 0.010524031079055831\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8121387283236994,\n \"acc_stderr\": 0.021029269752423203,\n \"acc_norm\": 0.8121387283236994,\n \"acc_norm_stderr\": 0.021029269752423203\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7854748603351955,\n \"acc_stderr\": 0.013728923407828855,\n \"acc_norm\": 0.7854748603351955,\n \"acc_norm_stderr\": 0.013728923407828855\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8496732026143791,\n \"acc_stderr\": 0.020464175124332625,\n \"acc_norm\": 0.8496732026143791,\n \"acc_norm_stderr\": 0.020464175124332625\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8102893890675241,\n \"acc_stderr\": 0.022268196258783228,\n \"acc_norm\": 0.8102893890675241,\n \"acc_norm_stderr\": 0.022268196258783228\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8672839506172839,\n \"acc_stderr\": 0.018877353839571842,\n \"acc_norm\": 0.8672839506172839,\n \"acc_norm_stderr\": 0.018877353839571842\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6276595744680851,\n \"acc_stderr\": 0.02883892147125145,\n \"acc_norm\": 0.6276595744680851,\n \"acc_norm_stderr\": 0.02883892147125145\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5788787483702738,\n \"acc_stderr\": 0.012610325733489905,\n \"acc_norm\": 0.5788787483702738,\n \"acc_norm_stderr\": 0.012610325733489905\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8198529411764706,\n \"acc_stderr\": 0.02334516361654485,\n \"acc_norm\": 0.8198529411764706,\n \"acc_norm_stderr\": 0.02334516361654485\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.815359477124183,\n \"acc_stderr\": 0.015697029240757776,\n \"acc_norm\": 0.815359477124183,\n \"acc_norm_stderr\": 0.015697029240757776\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8163265306122449,\n \"acc_stderr\": 0.02478907133200765,\n \"acc_norm\": 0.8163265306122449,\n \"acc_norm_stderr\": 0.02478907133200765\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.023537557657892567,\n \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.023537557657892567\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5887392900856793,\n \"mc1_stderr\": 0.01722562708366087,\n \"mc2\": 0.7354905615781797,\n \"mc2_stderr\": 0.014104277111112697\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.0105690211228259\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.6004548900682335,\n \"acc_stderr\": 0.013491660298815985\n }\n}\n```", "repo_url": "https://huggingface.co/one-man-army/UNA-34Beagles-32K-bf16-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|arc:challenge|25_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|gsm8k|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hellaswag|10_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T18-01-24.840782.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T18-01-24.840782.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T18-01-24.840782.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T18-01-24.840782.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T18-01-24.840782.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T18_01_24.840782", "path": ["**/details_harness|winogrande|5_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T18-01-24.840782.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T18_01_24.840782", "path": ["results_2024-01-14T18-01-24.840782.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T18-01-24.840782.parquet"]}]}]} | 2024-01-14T18:04:02+00:00 |
307f1064e6d6046dc2d5fbb81185cf6700ff7630 | Obreyer/freddy | [
"license:openrail",
"region:us"
] | 2024-01-14T18:07:39+00:00 | {"license": "openrail"} | 2024-01-14T18:09:47+00:00 |
|
5ed262282cbe1119ce881f0e8b25206c205e9345 |
# Dataset Card for Evaluation run of argilla/distilabeled-Marcoro14-7B-slerp-full
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [argilla/distilabeled-Marcoro14-7B-slerp-full](https://huggingface.co/argilla/distilabeled-Marcoro14-7B-slerp-full) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_argilla__distilabeled-Marcoro14-7B-slerp-full",
"harness_winogrande_5",
split="train")
```
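
The per-task configurations use the run timestamp and a "latest" split. As a minimal sketch (assuming the "results" configuration and "latest" split names listed in this card's config list), the aggregated results can be loaded the same way:

```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split points to
# the most recent evaluation run (here 2024-01-14T18:07:10).
results = load_dataset(
    "open-llm-leaderboard/details_argilla__distilabeled-Marcoro14-7B-slerp-full",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated metrics for the run
```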
## Latest results
These are the [latest results from run 2024-01-14T18:07:10.931926](https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__distilabeled-Marcoro14-7B-slerp-full/blob/main/results_2024-01-14T18-07-10.931926.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6579983930115316,
"acc_stderr": 0.031959390460197495,
"acc_norm": 0.6579231845624166,
"acc_norm_stderr": 0.03261951935121804,
"mc1": 0.48225214198286415,
"mc1_stderr": 0.017492470843075363,
"mc2": 0.6421417472476668,
"mc2_stderr": 0.015159369575596757
},
"harness|arc:challenge|25": {
"acc": 0.6783276450511946,
"acc_stderr": 0.013650488084494166,
"acc_norm": 0.7064846416382252,
"acc_norm_stderr": 0.01330725044494111
},
"harness|hellaswag|10": {
"acc": 0.6974706233817964,
"acc_stderr": 0.004584144014654942,
"acc_norm": 0.8755228042222665,
"acc_norm_stderr": 0.0032945048075552286
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.04024778401977108,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.04024778401977108
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.02552503438247489,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.02552503438247489
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328974,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328974
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083008,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083008
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.015173141845126243,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.015173141845126243
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.0245098039215686,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.0245098039215686
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066307,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066307
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.44581005586592176,
"acc_stderr": 0.016623998513333103,
"acc_norm": 0.44581005586592176,
"acc_norm_stderr": 0.016623998513333103
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959603,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959603
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5283687943262412,
"acc_stderr": 0.02977945095730305,
"acc_norm": 0.5283687943262412,
"acc_norm_stderr": 0.02977945095730305
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045699,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045699
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.01879808628488689,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.01879808628488689
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.48225214198286415,
"mc1_stderr": 0.017492470843075363,
"mc2": 0.6421417472476668,
"mc2_stderr": 0.015159369575596757
},
"harness|winogrande|5": {
"acc": 0.8200473559589582,
"acc_stderr": 0.01079646868806868
},
"harness|gsm8k|5": {
"acc": 0.7065959059893859,
"acc_stderr": 0.01254183081546149
}
}
```
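
Since the card links the raw results file directly, the same numbers can also be read without the `datasets` loader. This is a minimal sketch: the filename is the one linked above, and the key names follow the JSON snippet shown here (the downloaded file may wrap them under a top-level "results" key, which the sketch accounts for):

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON linked above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_argilla__distilabeled-Marcoro14-7B-slerp-full",
    repo_type="dataset",
    filename="results_2024-01-14T18-07-10.931926.json",
)

with open(path) as f:
    data = json.load(f)

# The metrics shown in this card may sit under a top-level "results" key.
results = data.get("results", data)

# "all" holds the aggregated accuracies reported in this card.
print(results["all"]["acc"], results["all"]["acc_norm"])
```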
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_argilla__distilabeled-Marcoro14-7B-slerp-full | [
"region:us"
] | 2024-01-14T18:09:48+00:00 | {"pretty_name": "Evaluation run of argilla/distilabeled-Marcoro14-7B-slerp-full", "dataset_summary": "Dataset automatically created during the evaluation run of model [argilla/distilabeled-Marcoro14-7B-slerp-full](https://huggingface.co/argilla/distilabeled-Marcoro14-7B-slerp-full) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_argilla__distilabeled-Marcoro14-7B-slerp-full\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T18:07:10.931926](https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__distilabeled-Marcoro14-7B-slerp-full/blob/main/results_2024-01-14T18-07-10.931926.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6579983930115316,\n \"acc_stderr\": 0.031959390460197495,\n \"acc_norm\": 0.6579231845624166,\n \"acc_norm_stderr\": 0.03261951935121804,\n \"mc1\": 0.48225214198286415,\n \"mc1_stderr\": 0.017492470843075363,\n \"mc2\": 0.6421417472476668,\n \"mc2_stderr\": 0.015159369575596757\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6783276450511946,\n \"acc_stderr\": 0.013650488084494166,\n \"acc_norm\": 0.7064846416382252,\n \"acc_norm_stderr\": 0.01330725044494111\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6974706233817964,\n \"acc_stderr\": 0.004584144014654942,\n \"acc_norm\": 0.8755228042222665,\n \"acc_norm_stderr\": 0.0032945048075552286\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n \"acc_stderr\": 0.04024778401977108,\n \"acc_norm\": 0.6814814814814815,\n \"acc_norm_stderr\": 0.04024778401977108\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43386243386243384,\n \"acc_stderr\": 0.02552503438247489,\n \"acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.02552503438247489\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328974,\n \"acc_norm\": 0.9067357512953368,\n 
\"acc_norm_stderr\": 0.02098685459328974\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8532110091743119,\n \"acc_stderr\": 0.015173141845126243,\n \"acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.015173141845126243\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538271,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538271\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.0245098039215686,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.0245098039215686\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 0.013468201614066307,\n \"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 0.013468201614066307\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44581005586592176,\n \"acc_stderr\": 0.016623998513333103,\n \"acc_norm\": 0.44581005586592176,\n \"acc_norm_stderr\": 0.016623998513333103\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959603,\n \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959603\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5283687943262412,\n \"acc_stderr\": 0.02977945095730305,\n \"acc_norm\": 0.5283687943262412,\n \"acc_norm_stderr\": 0.02977945095730305\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n \"acc_stderr\": 0.012739711554045699,\n \"acc_norm\": 0.4654498044328553,\n \"acc_norm_stderr\": 0.012739711554045699\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.684640522875817,\n \"acc_stderr\": 0.01879808628488689,\n \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.01879808628488689\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48225214198286415,\n \"mc1_stderr\": 0.017492470843075363,\n \"mc2\": 0.6421417472476668,\n \"mc2_stderr\": 0.015159369575596757\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8200473559589582,\n \"acc_stderr\": 0.01079646868806868\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7065959059893859,\n 
\"acc_stderr\": 0.01254183081546149\n }\n}\n```", "repo_url": "https://huggingface.co/argilla/distilabeled-Marcoro14-7B-slerp-full", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|arc:challenge|25_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|gsm8k|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hellaswag|10_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T18-07-10.931926.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T18-07-10.931926.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T18-07-10.931926.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T18-07-10.931926.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T18-07-10.931926.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T18_07_10.931926", "path": ["**/details_harness|winogrande|5_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T18-07-10.931926.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T18_07_10.931926", "path": ["results_2024-01-14T18-07-10.931926.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T18-07-10.931926.parquet"]}]}]} | 2024-01-14T18:10:29+00:00 |
123b70c682105d53e7b0e1a2189024d33ad7275d | kpriyanshu256/semeval-mono-test | [
"region:us"
] | 2024-01-14T18:13:21+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 83230306, "num_examples": 34272}], "download_size": 44874416, "dataset_size": 83230306}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T18:13:52+00:00 |
|
e64140fdb4ff594925f72eb9b0e02494f898df3f | kpriyanshu256/semeval-multi-test | [
"region:us"
] | 2024-01-14T18:14:05+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 99468808, "num_examples": 42378}], "download_size": 58558248, "dataset_size": 99468808}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T18:14:09+00:00 |
|
3274306bdf24458b4177af6935ccfe807a4751f7 | kpriyanshu256/semeval-b-test | [
"region:us"
] | 2024-01-14T18:14:19+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 43138311, "num_examples": 18000}], "download_size": 22358559, "dataset_size": 43138311}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T18:14:21+00:00 |
|
9a3a0b22c4adb6c7b19c1c4f82113e9793f743d5 | Tsuinzues/cristianotorreao | [
"license:openrail",
"region:us"
] | 2024-01-14T18:14:36+00:00 | {"license": "openrail"} | 2024-01-14T18:14:49+00:00 |
|
e7d09ab28922fc32dde9eec300c655ec5a5140da |
# Dataset Card for Evaluation run of kz919/mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kz919/mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models](https://huggingface.co/kz919/mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kz919__mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models",
"harness_winogrande_5",
split="train")
```
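Beyond the "train" split of a single task, you can point `load_dataset` at the aggregated "results" configuration or pin a specific run via its timestamped split. The snippet below is a minimal sketch: the "results" configuration and "latest" split follow the naming convention used by these leaderboard detail datasets, while the timestamped split name is inferred from the run timestamp further down this card and may need adjusting.

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_kz919__mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models"

# Aggregated metrics for the whole run (the "results" configuration).
results = load_dataset(REPO, "results", split="latest")

# A single task pinned to this run's timestamp instead of the moving "latest"/"train" split.
# The split name below is inferred from the run timestamp and is an assumption.
gsm8k = load_dataset(REPO, "harness_gsm8k_5", split="2024_01_14T18_15_50.698529")
```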
## Latest results
These are the [latest results from run 2024-01-14T18:15:50.698529](https://huggingface.co/datasets/open-llm-leaderboard/details_kz919__mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models/blob/main/results_2024-01-14T18-15-50.698529.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26648871501929594,
"acc_stderr": 0.03093030883128489,
"acc_norm": 0.2677809133729311,
"acc_norm_stderr": 0.03175527446298885,
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299953,
"mc2": 0.4880571743853537,
"mc2_stderr": 0.0172850771661607
},
"harness|arc:challenge|25": {
"acc": 0.20819112627986347,
"acc_stderr": 0.011864866118448064,
"acc_norm": 0.2551194539249147,
"acc_norm_stderr": 0.012739038695202105
},
"harness|hellaswag|10": {
"acc": 0.25692093208524197,
"acc_stderr": 0.004360424536145122,
"acc_norm": 0.2552280422226648,
"acc_norm_stderr": 0.004350982826580604
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073461,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073461
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.34868421052631576,
"acc_stderr": 0.03878139888797611,
"acc_norm": 0.34868421052631576,
"acc_norm_stderr": 0.03878139888797611
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2981132075471698,
"acc_stderr": 0.02815283794249386,
"acc_norm": 0.2981132075471698,
"acc_norm_stderr": 0.02815283794249386
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082633,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082633
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23829787234042554,
"acc_stderr": 0.027851252973889774,
"acc_norm": 0.23829787234042554,
"acc_norm_stderr": 0.027851252973889774
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525218,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525218
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.31290322580645163,
"acc_stderr": 0.026377567028645854,
"acc_norm": 0.31290322580645163,
"acc_norm_stderr": 0.026377567028645854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.031270907132976984,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.031270907132976984
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35233160621761656,
"acc_stderr": 0.03447478286414359,
"acc_norm": 0.35233160621761656,
"acc_norm_stderr": 0.03447478286414359
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3435897435897436,
"acc_stderr": 0.02407869658063547,
"acc_norm": 0.3435897435897436,
"acc_norm_stderr": 0.02407869658063547
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.031041941304059285,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.031041941304059285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3559633027522936,
"acc_stderr": 0.020528559278244218,
"acc_norm": 0.3559633027522936,
"acc_norm_stderr": 0.020528559278244218
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.20675105485232068,
"acc_stderr": 0.026361651668389104,
"acc_norm": 0.20675105485232068,
"acc_norm_stderr": 0.026361651668389104
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.12556053811659193,
"acc_stderr": 0.02223898546932376,
"acc_norm": 0.12556053811659193,
"acc_norm_stderr": 0.02223898546932376
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.14049586776859505,
"acc_stderr": 0.03172233426002161,
"acc_norm": 0.14049586776859505,
"acc_norm_stderr": 0.03172233426002161
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.15178571428571427,
"acc_stderr": 0.03405702838185694,
"acc_norm": 0.15178571428571427,
"acc_norm_stderr": 0.03405702838185694
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19230769230769232,
"acc_stderr": 0.025819233256483706,
"acc_norm": 0.19230769230769232,
"acc_norm_stderr": 0.025819233256483706
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20561941251596424,
"acc_stderr": 0.014452500456785825,
"acc_norm": 0.20561941251596424,
"acc_norm_stderr": 0.014452500456785825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.022289638852617904,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.022289638852617904
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3006535947712418,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.3006535947712418,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24758842443729903,
"acc_stderr": 0.024513879973621967,
"acc_norm": 0.24758842443729903,
"acc_norm_stderr": 0.024513879973621967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.023246202647819746,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.023246202647819746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.0258921511567094,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.0258921511567094
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23859191655801826,
"acc_stderr": 0.010885929742002221,
"acc_norm": 0.23859191655801826,
"acc_norm_stderr": 0.010885929742002221
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4375,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.21405228758169934,
"acc_stderr": 0.01659342966232903,
"acc_norm": 0.21405228758169934,
"acc_norm_stderr": 0.01659342966232903
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.041220665028782834,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.041220665028782834
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.39591836734693875,
"acc_stderr": 0.03130802899065686,
"acc_norm": 0.39591836734693875,
"acc_norm_stderr": 0.03130802899065686
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.03115715086935556,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.03115715086935556
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322674,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322674
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.03175554786629921,
"acc_norm": 0.21084337349397592,
"acc_norm_stderr": 0.03175554786629921
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.17543859649122806,
"acc_stderr": 0.029170885500727654,
"acc_norm": 0.17543859649122806,
"acc_norm_stderr": 0.029170885500727654
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299953,
"mc2": 0.4880571743853537,
"mc2_stderr": 0.0172850771661607
},
"harness|winogrande|5": {
"acc": 0.5019731649565904,
"acc_stderr": 0.014052376259225636
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
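The same figures can also be pulled programmatically. The following is a minimal sketch, assuming the standard `datasets` loading pattern for this repository; the `"results"` configuration name and the `"latest"` split follow the configuration list given in this card's metadata.

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_kz919__mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models"

# The "results" configuration stores the aggregated metrics shown above;
# the "latest" split always points at the most recent evaluation run.
results = load_dataset(REPO, "results", split="latest")

# Each row is one aggregated results record (parquet-backed).
print(results[0])
```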
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
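Pending a fuller description, the layout can be inferred from the configuration list in this card's metadata: one configuration per evaluated task plus an aggregated `results` configuration, each exposing a timestamped split per run and a `latest` split. The sketch below is a hedged example of exploring that layout; `harness_gsm8k_5` is simply one of the listed task configurations.

```python
from datasets import get_dataset_config_names, load_dataset

REPO = "open-llm-leaderboard/details_kz919__mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models"

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(REPO)
print(len(configs), configs[:5])

# Per-sample details for a single task; every configuration exposes a
# timestamped split per run and a "latest" split for the newest run.
gsm8k_details = load_dataset(REPO, "harness_gsm8k_5", split="latest")
print(gsm8k_details[0])
```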
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_kz919__mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models | [
"region:us"
] | 2024-01-14T18:18:11+00:00 | {"pretty_name": "Evaluation run of kz919/mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models", "dataset_summary": "Dataset automatically created during the evaluation run of model [kz919/mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models](https://huggingface.co/kz919/mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kz919__mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T18:15:50.698529](https://huggingface.co/datasets/open-llm-leaderboard/details_kz919__mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models/blob/main/results_2024-01-14T18-15-50.698529.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26648871501929594,\n \"acc_stderr\": 0.03093030883128489,\n \"acc_norm\": 0.2677809133729311,\n \"acc_norm_stderr\": 0.03175527446298885,\n \"mc1\": 0.2521419828641371,\n \"mc1_stderr\": 0.015201522246299953,\n \"mc2\": 0.4880571743853537,\n \"mc2_stderr\": 0.0172850771661607\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.20819112627986347,\n \"acc_stderr\": 0.011864866118448064,\n \"acc_norm\": 0.2551194539249147,\n \"acc_norm_stderr\": 0.012739038695202105\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25692093208524197,\n \"acc_stderr\": 0.004360424536145122,\n \"acc_norm\": 0.2552280422226648,\n \"acc_norm_stderr\": 0.004350982826580604\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.03633384414073461,\n \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.03633384414073461\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.34868421052631576,\n \"acc_stderr\": 0.03878139888797611,\n \"acc_norm\": 0.34868421052631576,\n \"acc_norm_stderr\": 0.03878139888797611\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.02815283794249386,\n \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.02815283794249386\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 
0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.32947976878612717,\n \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082633,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082633\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.23829787234042554,\n \"acc_stderr\": 0.027851252973889774,\n \"acc_norm\": 0.23829787234042554,\n \"acc_norm_stderr\": 0.027851252973889774\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.036951833116502325,\n \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.036951833116502325\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525218,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525218\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.31290322580645163,\n \"acc_stderr\": 0.026377567028645854,\n \"acc_norm\": 0.31290322580645163,\n \"acc_norm_stderr\": 0.026377567028645854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.031270907132976984,\n \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132976984\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.35233160621761656,\n 
\"acc_stderr\": 0.03447478286414359,\n \"acc_norm\": 0.35233160621761656,\n \"acc_norm_stderr\": 0.03447478286414359\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3435897435897436,\n \"acc_stderr\": 0.02407869658063547,\n \"acc_norm\": 0.3435897435897436,\n \"acc_norm_stderr\": 0.02407869658063547\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.031041941304059285,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.031041941304059285\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3559633027522936,\n \"acc_stderr\": 0.020528559278244218,\n \"acc_norm\": 0.3559633027522936,\n \"acc_norm_stderr\": 0.020528559278244218\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502325,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502325\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.20675105485232068,\n \"acc_stderr\": 0.026361651668389104,\n \"acc_norm\": 0.20675105485232068,\n \"acc_norm_stderr\": 0.026361651668389104\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.12556053811659193,\n \"acc_stderr\": 0.02223898546932376,\n \"acc_norm\": 0.12556053811659193,\n \"acc_norm_stderr\": 0.02223898546932376\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.14049586776859505,\n \"acc_stderr\": 0.03172233426002161,\n \"acc_norm\": 0.14049586776859505,\n \"acc_norm_stderr\": 0.03172233426002161\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.15178571428571427,\n \"acc_stderr\": 0.03405702838185694,\n \"acc_norm\": 0.15178571428571427,\n \"acc_norm_stderr\": 0.03405702838185694\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19230769230769232,\n \"acc_stderr\": 0.025819233256483706,\n \"acc_norm\": 0.19230769230769232,\n \"acc_norm_stderr\": 0.025819233256483706\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n 
\"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.20561941251596424,\n \"acc_stderr\": 0.014452500456785825,\n \"acc_norm\": 0.20561941251596424,\n \"acc_norm_stderr\": 0.014452500456785825\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.21965317919075145,\n \"acc_stderr\": 0.022289638852617904,\n \"acc_norm\": 0.21965317919075145,\n \"acc_norm_stderr\": 0.022289638852617904\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3006535947712418,\n \"acc_stderr\": 0.02625605383571896,\n \"acc_norm\": 0.3006535947712418,\n \"acc_norm_stderr\": 0.02625605383571896\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24758842443729903,\n \"acc_stderr\": 0.024513879973621967,\n \"acc_norm\": 0.24758842443729903,\n \"acc_norm_stderr\": 0.024513879973621967\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819746,\n \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819746\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.25177304964539005,\n \"acc_stderr\": 0.0258921511567094,\n \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.0258921511567094\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23859191655801826,\n \"acc_stderr\": 0.010885929742002221,\n \"acc_norm\": 0.23859191655801826,\n \"acc_norm_stderr\": 0.010885929742002221\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.030134614954403924,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.030134614954403924\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.21405228758169934,\n \"acc_stderr\": 0.01659342966232903,\n \"acc_norm\": 0.21405228758169934,\n \"acc_norm_stderr\": 0.01659342966232903\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n \"acc_stderr\": 0.041220665028782834,\n \"acc_norm\": 0.24545454545454545,\n \"acc_norm_stderr\": 0.041220665028782834\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.39591836734693875,\n \"acc_stderr\": 0.03130802899065686,\n \"acc_norm\": 0.39591836734693875,\n \"acc_norm_stderr\": 0.03130802899065686\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n \"acc_stderr\": 0.03115715086935556,\n \"acc_norm\": 0.263681592039801,\n \"acc_norm_stderr\": 0.03115715086935556\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322674,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322674\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n \"acc_stderr\": 0.03175554786629921,\n \"acc_norm\": 0.21084337349397592,\n \"acc_norm_stderr\": 0.03175554786629921\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.17543859649122806,\n \"acc_stderr\": 0.029170885500727654,\n \"acc_norm\": 0.17543859649122806,\n \"acc_norm_stderr\": 0.029170885500727654\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2521419828641371,\n \"mc1_stderr\": 0.015201522246299953,\n \"mc2\": 0.4880571743853537,\n \"mc2_stderr\": 0.0172850771661607\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5019731649565904,\n \"acc_stderr\": 
0.014052376259225636\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/kz919/mistral-7b-dpo-open-orca-flan-50k-synthetic-5-models", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|arc:challenge|25_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|gsm8k|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hellaswag|10_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T18-15-50.698529.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T18-15-50.698529.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T18-15-50.698529.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T18-15-50.698529.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T18-15-50.698529.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["**/details_harness|winogrande|5_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-14T18-15-50.698529.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T18_15_50.698529", "path": ["results_2024-01-14T18-15-50.698529.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T18-15-50.698529.parquet"]}]}]} | 2024-01-14T18:18:33+00:00 |
c36ac7830c728036ccbefc84aeecee825f46b466 | DAVIX08BR/IAdaVO | [
"region:us"
] | 2024-01-14T18:28:15+00:00 | {} | 2024-01-14T18:43:13+00:00 |
|
ce9c69275447db56a874020f8d95dfa55690073a | Vitorbr2009/voz-afauna-treinada | [
"license:openrail",
"region:us"
] | 2024-01-14T18:32:30+00:00 | {"license": "openrail"} | 2024-01-14T18:33:22+00:00 |
|
e7c6cec85137b039b0d570ba42feeb34e1706a43 | runes/3D | [
"license:cc",
"region:us"
] | 2024-01-14T18:32:36+00:00 | {"license": "cc"} | 2024-01-14T19:54:50+00:00 |