| sha (stringlengths 40–40) | text (stringlengths 0–13.4M) | id (stringlengths 2–117) | tags (sequence) | created_at (stringlengths 25–25) | metadata (stringlengths 2–31.7M) | last_modified (stringlengths 25–25) |
|---|---|---|---|---|---|---|
1416fda916724d8c218f585733f8b0bf02ea57b7 |
# Dataset of independence/インディペンデンス/独立 (Azur Lane)
This is the dataset of independence/インディペンデンス/独立 (Azur Lane), containing 68 images and their tags.
The core tags of this character are `breasts, red_eyes, bangs, long_hair, brown_hair, ahoge, hairband, large_breasts, hair_ornament, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 68 | 93.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/independence_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 68 | 56.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/independence_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 146 | 102.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/independence_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 68 | 86.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/independence_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 146 | 139.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/independence_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
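For the IMG+TXT packages listed above, each image is paired with a plain-text tag file. Below is a minimal loading sketch, assuming the archive places a same-named `.txt` tag file next to each image (which the IMG+TXT type implies, but verify after extraction):
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from PIL import Image

# download and extract the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/independence_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its tag file (assumed to share the same stem)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
        continue
    txt_path = os.path.join(dataset_dir, stem + '.txt')
    tags = ''
    if os.path.exists(txt_path):
        with open(txt_path, 'r', encoding='utf-8') as f:
            tags = f.read().strip()
    image = Image.open(os.path.join(dataset_dir, name))
    print(name, image.size, tags)
```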
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/independence_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | blush, 1girl, looking_at_viewer, solo, white_shirt, blue_skirt, pleated_skirt, school_uniform, collared_shirt, smile, earrings, long_sleeves, black_pantyhose, cardigan, collarbone, jacket, white_background, closed_mouth, hairclip, official_alternate_costume, open_clothes, shoes, simple_background, striped_necktie |
| 1 | 14 |  |  |  |  |  | 1girl, looking_at_viewer, solo, bare_shoulders, red_gloves, sideboob, skirt, rudder_footwear, thighhighs, jacket, black_hair, rigging, bow_(weapon), flight_deck, headband, off_shoulder, sleeveless |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blush | 1girl | looking_at_viewer | solo | white_shirt | blue_skirt | pleated_skirt | school_uniform | collared_shirt | smile | earrings | long_sleeves | black_pantyhose | cardigan | collarbone | jacket | white_background | closed_mouth | hairclip | official_alternate_costume | open_clothes | shoes | simple_background | striped_necktie | bare_shoulders | red_gloves | sideboob | skirt | rudder_footwear | thighhighs | black_hair | rigging | bow_(weapon) | flight_deck | headband | off_shoulder | sleeveless |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------|:--------------|:-------------|:----------------|:-----------------|:-----------------|:--------|:-----------|:---------------|:------------------|:-----------|:-------------|:---------|:-------------------|:---------------|:-----------|:-----------------------------|:---------------|:--------|:--------------------|:------------------|:-----------------|:-------------|:-----------|:--------|:------------------|:-------------|:-------------|:----------|:---------------|:--------------|:-----------|:---------------|:-------------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | | X | X | X | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/independence_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T21:42:59+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T22:01:19+00:00 |
fffefea572ab55927cd35e39d0f9bfc175500cdf |
# Dataset of acasta/アカスタ/阿卡司塔 (Azur Lane)
This is the dataset of acasta/アカスタ/阿卡司塔 (Azur Lane), containing 24 images and their tags.
The core tags of this character are `black_hair, blue_eyes, bangs, breasts, short_hair, hat, multicolored_hair, blue_hair, bow, one_side_up, large_breasts, ribbon, beret, blue_bow, blunt_bangs, hair_bow, white_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 24 | 23.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/acasta_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 24 | 16.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/acasta_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 49 | 30.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/acasta_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 24 | 21.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/acasta_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 49 | 37.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/acasta_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/acasta_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, white_shirt, black_skirt, closed_mouth, collared_shirt, long_sleeves, bag, blue_headwear, full_body, pleated_skirt, simple_background, black_choker, black_footwear, black_thighhighs, boots, chibi, coat, medium_hair, open_jacket, own_hands_together, shoes, sitting, smile, twitter_username, white_background |
| 1 | 6 |  |  |  |  |  | 1girl, blue_skirt, long_sleeves, simple_background, white_background, blush, looking_at_viewer, pleated_skirt, solo, white_thighhighs, frilled_skirt, cannon, garter_straps, high-waist_skirt, holding, loafers, machinery, medium_breasts, turret, white_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | looking_at_viewer | solo | white_shirt | black_skirt | closed_mouth | collared_shirt | long_sleeves | bag | blue_headwear | full_body | pleated_skirt | simple_background | black_choker | black_footwear | black_thighhighs | boots | chibi | coat | medium_hair | open_jacket | own_hands_together | shoes | sitting | smile | twitter_username | white_background | blue_skirt | white_thighhighs | frilled_skirt | cannon | garter_straps | high-waist_skirt | holding | loafers | machinery | medium_breasts | turret |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------|:--------------|:--------------|:---------------|:-----------------|:---------------|:------|:----------------|:------------|:----------------|:--------------------|:---------------|:-----------------|:-------------------|:--------|:--------|:-------|:--------------|:--------------|:---------------------|:--------|:----------|:--------|:-------------------|:-------------------|:-------------|:-------------------|:----------------|:---------|:----------------|:-------------------|:----------|:----------|:------------|:-----------------|:---------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | | | | X | | | | X | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/acasta_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T21:43:18+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T21:50:51+00:00 |
3ab4f21448349e6ec5c05674faa1e3ab3bf35796 |
# Dataset of spence/スペンス/斯彭斯 (Azur Lane)
This is the dataset of spence/スペンス/斯彭斯 (Azur Lane), containing 15 images and their tags.
The core tags of this character are `hair_ornament, long_hair, pink_hair, bangs, two_side_up, yellow_eyes, hat, beret`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 15 | 12.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/spence_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 15 | 8.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/spence_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 32 | 16.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/spence_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 15 | 11.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/spence_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 32 | 22.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/spence_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/spence_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | hair_bobbles, blush, 1girl, tears, open_mouth, sleeveless, dress, simple_background, solo, hat_feather, looking_at_viewer, white_background, black_pantyhose, sailor_collar, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | hair_bobbles | blush | 1girl | tears | open_mouth | sleeveless | dress | simple_background | solo | hat_feather | looking_at_viewer | white_background | black_pantyhose | sailor_collar | smile |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------|:--------|:--------|:--------|:-------------|:-------------|:--------|:--------------------|:-------|:--------------|:--------------------|:-------------------|:------------------|:----------------|:--------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/spence_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T21:43:21+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T21:48:17+00:00 |
060ad3df88a6ba5f5546c622652290f38e73ceba |
#### Attention: This dataset is a summary and reformatting of data pulled from GitHub code.
You should make your own assessment based on this.
In fact, there is another dataset I formed through parsing that addresses several issues:
- Out of 500k Python-related items, most of them are Python-ish, not Pythonic.
- The majority of the items here include excessive license text from the original code.
- Some items are not even Python, only containing references to it.
- There are a whole lot of GPL summaries floating around in the code responses or instructions.
As such, you are probably not getting good data to begin with, but this should be used as a starting point at best.
You have been warned.
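Given the caveats above, here is a minimal filtering sketch, assuming the `instruction`/`output`/`system` columns listed in the dataset metadata; the license-text markers below are purely illustrative heuristics, not a vetted cleaning recipe:
```python
from datasets import load_dataset

ds = load_dataset("jtatman/python-code-dataset-500k", split="train")

# crude, illustrative markers for the license/GPL boilerplate mentioned above
LICENSE_MARKERS = (
    "gnu general public license",
    "general public license",
    "copyright (c)",
    "all rights reserved",
)

def looks_clean(example):
    # drop items whose output contains obvious license boilerplate
    text = (example["output"] or "").lower()
    return not any(marker in text for marker in LICENSE_MARKERS)

filtered = ds.filter(looks_clean)
print(f"kept {len(filtered)} of {len(ds)} examples")
```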
| jtatman/python-code-dataset-500k | [
"task_categories:text-generation",
"size_categories:100K<n<1M",
"license:mit",
"instructional",
"python",
"code",
"region:us"
] | 2024-01-13T21:44:31+00:00 | {"license": "mit", "size_categories": ["100K<n<1M"], "task_categories": ["text-generation"], "pretty_name": "github_python", "dataset_info": {"features": [{"name": "output", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "system", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 922266591, "num_examples": 559515}], "download_size": 346944286, "dataset_size": 922266591}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["instructional", "python", "code"]} | 2024-01-23T21:39:13+00:00 |
337131bee7d9e3a6fb4bb6b9c6eb8fcfb55c6575 |
# Dataset of am_ksg/AmKSG/KSG (Girls' Frontline)
This is the dataset of am_ksg/AmKSG/KSG (Girls' Frontline), containing 12 images and their tags.
The core tags of this character are `bangs, short_hair, sunglasses, white_hair, grey_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 15.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/am_ksg_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 7.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/am_ksg_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 25 | 15.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/am_ksg_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 13.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/am_ksg_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 25 | 23.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/am_ksg_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/am_ksg_girlsfrontline',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, jacket, solo, fingerless_gloves, hood_up, gun, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | jacket | solo | fingerless_gloves | hood_up | gun | looking_at_viewer |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-------|:--------------------|:----------|:------|:--------------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X |
| CyberHarem/am_ksg_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T21:45:04+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T21:50:13+00:00 |
94812c6b213b2363534e7774acb4b8f3b45c5ac4 |
# Dataset of p38/P38/P38 (Girls' Frontline)
This is the dataset of p38/P38/P38 (Girls' Frontline), containing 11 images and their tags.
The core tags of this character are `brown_hair, hat, garrison_cap, military_hat, long_hair, purple_eyes, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 6.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p38_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 5.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p38_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 21 | 10.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p38_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 6.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p38_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 21 | 12.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p38_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/p38_girlsfrontline',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, military_uniform, solo, belt, white_background, handgun, iron_cross, jacket, open_mouth, black_skirt, boots, holding_gun, holster, looking_at_viewer, simple_background, thighhighs, collared_shirt, pleated_skirt, pouch, walther |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | military_uniform | solo | belt | white_background | handgun | iron_cross | jacket | open_mouth | black_skirt | boots | holding_gun | holster | looking_at_viewer | simple_background | thighhighs | collared_shirt | pleated_skirt | pouch | walther |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:-------|:-------|:-------------------|:----------|:-------------|:---------|:-------------|:--------------|:--------|:--------------|:----------|:--------------------|:--------------------|:-------------|:-----------------|:----------------|:--------|:----------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/p38_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T21:45:07+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T21:49:00+00:00 |
2c04d53791dec5a70c2cabdb30a248f013549867 |
An augmented and further modified version of [LimaRP](https://huggingface.co/datasets/lemonilia/LimaRP) in Fastchat format, with the following changes:
- The first prompt is modified to add context and simple references to aspects of the conversation (OOC, use of emojis, content), and to include persona descriptions of the characters involved, scenario descriptions, and content tags.
- Certain irrelevant tags were removed from the first prompt (4K, grammarchecked, etc.).
- Any placeholders were replaced with randomly generated names from [Faker](https://pypi.org/project/Faker/), with proper introductions added in the first prompt.
- All split conversations were joined to train long-context models (you may need to re-split them to fit your context length if you are not doing this).
- The assistant always plays a single character consistently and never plays multiple characters. The user may play multiple characters; if so, this is clearly explained in the first prompt. | grimulkan/LimaRP-augmented | [
"license:unknown",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T21:46:03+00:00 | {"license": "unknown", "tags": ["not-for-all-audiences"]} | 2024-01-24T00:01:23+00:00 |
fdf052d6709f428a80ad59e5bf006f2127ed0bf3 |
An augmented and further modified version of the [Jannie-log](https://huggingface.co/datasets/v2ray/jannie-log) moxxie proxy logs in Fastchat format, with the following changes:
- The first prompt is modified to add context and simple references to aspects of the conversation (OOC, use of emojis, content).
- Any placeholders were replaced with randomly generated names from [Faker](https://pypi.org/project/Faker/), with proper introductions added in the first prompt.
- All split conversations were joined to train long-context models (you may need to re-split them to fit your context length if you are not doing this) - this is the main reason you'd want to use this version of the dataset.
- Conversations with only a single round were removed.
- Only English-language output is included.
- Refusals and moralizing statements from OpenAI, Anthropic, etc. were removed, as were proxy errors.
- Repeated requests by the user to ignore alignment were removed. You no longer need these if you are fine-tuning an uncensored base model (and they reduce the quality of the training).
- Proxy logs include lots of repeated conversations that go down different paths. All of these duplicates have been removed, keeping the longest unique path through the conversation tree.
- **Only GPT-4 output is included**. | grimulkan/jannie-log-augmented | [
"license:unknown",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T21:57:24+00:00 | {"license": "unknown", "tags": ["not-for-all-audiences"]} | 2024-01-24T00:01:12+00:00 |
d2e4deb2a604e1ce7258ee3683676b3650d95e2a |
# Dataset Card for Evaluation run of CultriX/MistralTrix-SLERP
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CultriX/MistralTrix-SLERP](https://huggingface.co/CultriX/MistralTrix-SLERP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
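To see which of the 63 configurations are available before loading one, you can list them with the `datasets` library (a small sketch; the exact names follow the `harness_*` pattern used in the example below):
```python
from datasets import get_dataset_config_names

# list all evaluation task configurations available in this details repo
configs = get_dataset_config_names("open-llm-leaderboard/details_CultriX__MistralTrix-SLERP")
print(len(configs), "configurations, e.g.:", configs[:5])
```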
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CultriX__MistralTrix-SLERP",
                    "harness_winogrande_5",
                    split="train")
```
## Latest results
These are the [latest results from run 2024-01-13T21:57:09.526776](https://huggingface.co/datasets/open-llm-leaderboard/details_CultriX__MistralTrix-SLERP/blob/main/results_2024-01-13T21-57-09.526776.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6548762565479429,
"acc_stderr": 0.03203510482454222,
"acc_norm": 0.65460268949256,
"acc_norm_stderr": 0.0326982976348309,
"mc1": 0.49571603427172584,
"mc1_stderr": 0.017502858577371275,
"mc2": 0.653460703870151,
"mc2_stderr": 0.015284820606060751
},
"harness|arc:challenge|25": {
"acc": 0.6843003412969283,
"acc_stderr": 0.013582571095815291,
"acc_norm": 0.7081911262798635,
"acc_norm_stderr": 0.013284525292403513
},
"harness|hellaswag|10": {
"acc": 0.6971718781119299,
"acc_stderr": 0.0045854245130121036,
"acc_norm": 0.8754232224656443,
"acc_norm_stderr": 0.0032956349076664645
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700914,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700914
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268542,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268542
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.0315841532404771,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.0315841532404771
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603348,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603348
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.01517314184512625,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.01517314184512625
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4402234636871508,
"acc_stderr": 0.016602564615049935,
"acc_norm": 0.4402234636871508,
"acc_norm_stderr": 0.016602564615049935
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.02368359183700856,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.02368359183700856
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46088657105606257,
"acc_stderr": 0.012731102790504515,
"acc_norm": 0.46088657105606257,
"acc_norm_stderr": 0.012731102790504515
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142777,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142777
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.49571603427172584,
"mc1_stderr": 0.017502858577371275,
"mc2": 0.653460703870151,
"mc2_stderr": 0.015284820606060751
},
"harness|winogrande|5": {
"acc": 0.8168902920284136,
"acc_stderr": 0.01086977863316837
},
"harness|gsm8k|5": {
"acc": 0.711144806671721,
"acc_stderr": 0.012484219800126666
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_CultriX__MistralTrix-SLERP | [
"region:us"
] | 2024-01-13T21:59:25+00:00 | {"pretty_name": "Evaluation run of CultriX/MistralTrix-SLERP", "dataset_summary": "Dataset automatically created during the evaluation run of model [CultriX/MistralTrix-SLERP](https://huggingface.co/CultriX/MistralTrix-SLERP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CultriX__MistralTrix-SLERP\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T21:57:09.526776](https://huggingface.co/datasets/open-llm-leaderboard/details_CultriX__MistralTrix-SLERP/blob/main/results_2024-01-13T21-57-09.526776.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6548762565479429,\n \"acc_stderr\": 0.03203510482454222,\n \"acc_norm\": 0.65460268949256,\n \"acc_norm_stderr\": 0.0326982976348309,\n \"mc1\": 0.49571603427172584,\n \"mc1_stderr\": 0.017502858577371275,\n \"mc2\": 0.653460703870151,\n \"mc2_stderr\": 0.015284820606060751\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6843003412969283,\n \"acc_stderr\": 0.013582571095815291,\n \"acc_norm\": 0.7081911262798635,\n \"acc_norm_stderr\": 0.013284525292403513\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6971718781119299,\n \"acc_stderr\": 0.0045854245130121036,\n \"acc_norm\": 0.8754232224656443,\n \"acc_norm_stderr\": 0.0032956349076664645\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n 
\"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268542,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268542\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.0315841532404771,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.0315841532404771\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n 
\"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8532110091743119,\n \"acc_stderr\": 0.01517314184512625,\n \"acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.01517314184512625\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n \"acc_norm_stderr\": 
0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4402234636871508,\n \"acc_stderr\": 0.016602564615049935,\n \"acc_norm\": 0.4402234636871508,\n \"acc_norm_stderr\": 0.016602564615049935\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.02368359183700856,\n \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.02368359183700856\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46088657105606257,\n \"acc_stderr\": 0.012731102790504515,\n \"acc_norm\": 0.46088657105606257,\n \"acc_norm_stderr\": 0.012731102790504515\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142777,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142777\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.49571603427172584,\n \"mc1_stderr\": 0.017502858577371275,\n \"mc2\": 0.653460703870151,\n \"mc2_stderr\": 0.015284820606060751\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8168902920284136,\n \"acc_stderr\": 0.01086977863316837\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.711144806671721,\n \"acc_stderr\": 0.012484219800126666\n }\n}\n```", "repo_url": "https://huggingface.co/CultriX/MistralTrix-SLERP", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|arc:challenge|25_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|gsm8k|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hellaswag|10_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T21-57-09.526776.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T21-57-09.526776.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T21-57-09.526776.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T21-57-09.526776.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T21-57-09.526776.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T21-57-09.526776.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["**/details_harness|winogrande|5_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T21-57-09.526776.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T21_57_09.526776", "path": ["results_2024-01-13T21-57-09.526776.parquet"]}, {"split": "latest", "path": 
["results_2024-01-13T21-57-09.526776.parquet"]}]}]} | 2024-01-13T21:59:45+00:00 |
515ee956a8e610144d87924865729506a8e68827 | Muhammad89/AOT | [
"region:us"
] | 2024-01-13T22:03:47+00:00 | {} | 2024-01-14T13:36:09+00:00 |
|
a7532bce5f54ad9ee318b789092c26f8a458fdd0 |
# Dataset Card for Evaluation run of kevin009/flyingllama
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kevin009/flyingllama](https://huggingface.co/kevin009/flyingllama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kevin009__flyingllama",
"harness_winogrande_5",
split="train")
```
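If you are unsure which configurations exist, you can also enumerate them and load the most recent split directly. The sketch below is a minimal example: the `get_dataset_config_names` helper comes from the `datasets` library, and the `"latest"` split name is an assumption taken from the `configs` listing in this card's metadata rather than from the prose above, so adjust it if your copy of the repository exposes different split names.
```python
from datasets import get_dataset_config_names, load_dataset

# Enumerate every evaluated-task configuration available in this repository.
configs = get_dataset_config_names("open-llm-leaderboard/details_kevin009__flyingllama")
print(len(configs), "configurations, e.g.", configs[:3])

# Load the most recent results for one task via the "latest" split
# (assumed here based on the config listing in this card's metadata).
winogrande_details = load_dataset(
    "open-llm-leaderboard/details_kevin009__flyingllama",
    "harness_winogrande_5",
    split="latest",
)
print(winogrande_details[0])
```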
## Latest results
These are the [latest results from run 2024-01-13T22:02:33.000952](https://huggingface.co/datasets/open-llm-leaderboard/details_kevin009__flyingllama/blob/main/results_2024-01-13T22-02-33.000952.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26132503313334,
"acc_stderr": 0.030889306167362122,
"acc_norm": 0.26324892209928613,
"acc_norm_stderr": 0.03171228882658279,
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.41600624216438986,
"mc2_stderr": 0.01490081265282921
},
"harness|arc:challenge|25": {
"acc": 0.21245733788395904,
"acc_stderr": 0.011953482906582949,
"acc_norm": 0.24744027303754265,
"acc_norm_stderr": 0.01261035266329267
},
"harness|hellaswag|10": {
"acc": 0.32642899820752835,
"acc_stderr": 0.004679479763516778,
"acc_norm": 0.38348934475204144,
"acc_norm_stderr": 0.004852420856631477
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03820169914517905,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03820169914517905
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.031546980450822305,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.031546980450822305
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.02634148037111836,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.02634148037111836
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.037161774375660164,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.037161774375660164
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.0321473730202947,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.0321473730202947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.19148936170212766,
"acc_stderr": 0.025722149992637798,
"acc_norm": 0.19148936170212766,
"acc_norm_stderr": 0.025722149992637798
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135303,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135303
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525214,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523811,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523811
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.29354838709677417,
"acc_stderr": 0.02590608702131929,
"acc_norm": 0.29354838709677417,
"acc_norm_stderr": 0.02590608702131929
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.031270907132976984,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.031270907132976984
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603488,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603488
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3484848484848485,
"acc_stderr": 0.033948539651564025,
"acc_norm": 0.3484848484848485,
"acc_norm_stderr": 0.033948539651564025
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.33678756476683935,
"acc_stderr": 0.03410780251836182,
"acc_norm": 0.33678756476683935,
"acc_norm_stderr": 0.03410780251836182
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.32564102564102565,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.32564102564102565,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.02708037281514566,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.02708037281514566
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.026653531596715466,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.026653531596715466
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.29908256880733947,
"acc_stderr": 0.019630417285415175,
"acc_norm": 0.29908256880733947,
"acc_norm_stderr": 0.019630417285415175
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.02955429260569506,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.02955429260569506
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.22362869198312235,
"acc_stderr": 0.027123298205229972,
"acc_norm": 0.22362869198312235,
"acc_norm_stderr": 0.027123298205229972
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.15695067264573992,
"acc_stderr": 0.024413587174907426,
"acc_norm": 0.15695067264573992,
"acc_norm_stderr": 0.024413587174907426
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.35537190082644626,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.35537190082644626,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.1875,
"acc_stderr": 0.0370468111477387,
"acc_norm": 0.1875,
"acc_norm_stderr": 0.0370468111477387
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.02704685763071666,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.02704685763071666
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26309067688378035,
"acc_stderr": 0.015745497169049057,
"acc_norm": 0.26309067688378035,
"acc_norm_stderr": 0.015745497169049057
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23410404624277456,
"acc_stderr": 0.022797110278071145,
"acc_norm": 0.23410404624277456,
"acc_norm_stderr": 0.022797110278071145
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.02555316999182651,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.02555316999182651
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.18971061093247588,
"acc_stderr": 0.022268196258783228,
"acc_norm": 0.18971061093247588,
"acc_norm_stderr": 0.022268196258783228
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22839506172839505,
"acc_stderr": 0.023358211840626263,
"acc_norm": 0.22839506172839505,
"acc_norm_stderr": 0.023358211840626263
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22695035460992907,
"acc_stderr": 0.024987106365642973,
"acc_norm": 0.22695035460992907,
"acc_norm_stderr": 0.024987106365642973
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27183833116036504,
"acc_stderr": 0.011363135278651411,
"acc_norm": 0.27183833116036504,
"acc_norm_stderr": 0.011363135278651411
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.01716058723504634,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.01716058723504634
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072773,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072773
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.31020408163265306,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.31020408163265306,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.27860696517412936,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.27860696517412936,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370519,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370519
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.03488647713457923,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.03488647713457923
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.41600624216438986,
"mc2_stderr": 0.01490081265282921
},
"harness|winogrande|5": {
"acc": 0.5011838989739542,
"acc_stderr": 0.014052446290529019
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
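To work with these aggregated numbers programmatically instead of reading the JSON above, a minimal sketch (assuming the "results" configuration described earlier in this card and a "latest" split, mirroring the per-task configurations) is:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split name is an assumption based on the other configs.
results = load_dataset(
    "open-llm-leaderboard/details_kevin009__flyingllama",
    "results",
    split="latest",
)
print(results[0])  # per-task-group accuracies and standard errors
```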
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_kevin009__flyingllama | [
"region:us"
] | 2024-01-13T22:03:52+00:00 | {"pretty_name": "Evaluation run of kevin009/flyingllama", "dataset_summary": "Dataset automatically created during the evaluation run of model [kevin009/flyingllama](https://huggingface.co/kevin009/flyingllama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kevin009__flyingllama\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T22:02:33.000952](https://huggingface.co/datasets/open-llm-leaderboard/details_kevin009__flyingllama/blob/main/results_2024-01-13T22-02-33.000952.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26132503313334,\n \"acc_stderr\": 0.030889306167362122,\n \"acc_norm\": 0.26324892209928613,\n \"acc_norm_stderr\": 0.03171228882658279,\n \"mc1\": 0.23990208078335373,\n \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.41600624216438986,\n \"mc2_stderr\": 0.01490081265282921\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.21245733788395904,\n \"acc_stderr\": 0.011953482906582949,\n \"acc_norm\": 0.24744027303754265,\n \"acc_norm_stderr\": 0.01261035266329267\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.32642899820752835,\n \"acc_stderr\": 0.004679479763516778,\n \"acc_norm\": 0.38348934475204144,\n \"acc_norm_stderr\": 0.004852420856631477\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03820169914517905,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03820169914517905\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.031546980450822305,\n \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.031546980450822305\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.02634148037111836,\n \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.02634148037111836\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n \"acc_stderr\": 0.037161774375660164,\n \"acc_norm\": 0.2708333333333333,\n \"acc_norm_stderr\": 0.037161774375660164\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n 
\"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n \"acc_stderr\": 0.0321473730202947,\n \"acc_norm\": 0.23121387283236994,\n \"acc_norm_stderr\": 0.0321473730202947\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.19148936170212766,\n \"acc_stderr\": 0.025722149992637798,\n \"acc_norm\": 0.19148936170212766,\n \"acc_norm_stderr\": 0.025722149992637798\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135303,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135303\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525214,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525214\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.29354838709677417,\n \"acc_stderr\": 0.02590608702131929,\n \"acc_norm\": 0.29354838709677417,\n \"acc_norm_stderr\": 0.02590608702131929\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.031270907132976984,\n \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132976984\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603488,\n \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603488\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3484848484848485,\n \"acc_stderr\": 0.033948539651564025,\n \"acc_norm\": 0.3484848484848485,\n \"acc_norm_stderr\": 0.033948539651564025\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.33678756476683935,\n \"acc_stderr\": 0.03410780251836182,\n \"acc_norm\": 0.33678756476683935,\n \"acc_norm_stderr\": 0.03410780251836182\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.32564102564102565,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.32564102564102565,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514566,\n \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514566\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.026653531596715466,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.026653531596715466\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.29908256880733947,\n \"acc_stderr\": 0.019630417285415175,\n \"acc_norm\": 0.29908256880733947,\n \"acc_norm_stderr\": 0.019630417285415175\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.23039215686274508,\n \"acc_stderr\": 0.02955429260569506,\n \"acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.02955429260569506\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.22362869198312235,\n \"acc_stderr\": 0.027123298205229972,\n \"acc_norm\": 0.22362869198312235,\n \"acc_norm_stderr\": 0.027123298205229972\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.15695067264573992,\n \"acc_stderr\": 0.024413587174907426,\n \"acc_norm\": 0.15695067264573992,\n \"acc_norm_stderr\": 0.024413587174907426\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.35537190082644626,\n \"acc_stderr\": 0.04369236326573981,\n \"acc_norm\": 0.35537190082644626,\n \"acc_norm_stderr\": 0.04369236326573981\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615624,\n \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615624\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.1875,\n \"acc_stderr\": 0.0370468111477387,\n \"acc_norm\": 0.1875,\n \"acc_norm_stderr\": 0.0370468111477387\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.21794871794871795,\n \"acc_stderr\": 0.02704685763071666,\n \"acc_norm\": 0.21794871794871795,\n \"acc_norm_stderr\": 0.02704685763071666\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26309067688378035,\n \"acc_stderr\": 0.015745497169049057,\n 
\"acc_norm\": 0.26309067688378035,\n \"acc_norm_stderr\": 0.015745497169049057\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.23410404624277456,\n \"acc_stderr\": 0.022797110278071145,\n \"acc_norm\": 0.23410404624277456,\n \"acc_norm_stderr\": 0.022797110278071145\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.02555316999182651,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.02555316999182651\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.18971061093247588,\n \"acc_stderr\": 0.022268196258783228,\n \"acc_norm\": 0.18971061093247588,\n \"acc_norm_stderr\": 0.022268196258783228\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.22839506172839505,\n \"acc_stderr\": 0.023358211840626263,\n \"acc_norm\": 0.22839506172839505,\n \"acc_norm_stderr\": 0.023358211840626263\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.22695035460992907,\n \"acc_stderr\": 0.024987106365642973,\n \"acc_norm\": 0.22695035460992907,\n \"acc_norm_stderr\": 0.024987106365642973\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27183833116036504,\n \"acc_stderr\": 0.011363135278651411,\n \"acc_norm\": 0.27183833116036504,\n \"acc_norm_stderr\": 0.011363135278651411\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.01716058723504634,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.01716058723504634\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072773,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072773\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.31020408163265306,\n \"acc_stderr\": 0.029613459872484378,\n \"acc_norm\": 0.31020408163265306,\n \"acc_norm_stderr\": 0.029613459872484378\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.27860696517412936,\n \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.27860696517412936,\n \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370519,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370519\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.03488647713457923,\n \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.03488647713457923\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23990208078335373,\n \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.41600624216438986,\n \"mc2_stderr\": 0.01490081265282921\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5011838989739542,\n \"acc_stderr\": 0.014052446290529019\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/kevin009/flyingllama", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|arc:challenge|25_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|gsm8k|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hellaswag|10_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-02-33.000952.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-02-33.000952.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-02-33.000952.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T22-02-33.000952.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-02-33.000952.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-02-33.000952.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["**/details_harness|winogrande|5_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T22-02-33.000952.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T22_02_33.000952", "path": ["results_2024-01-13T22-02-33.000952.parquet"]}, {"split": "latest", "path": 
["results_2024-01-13T22-02-33.000952.parquet"]}]}]} | 2024-01-13T22:04:12+00:00 |
00de19d680c99e2d41eb97a77d37eb08bef2d624 | RealTimeData/arxiv_alltime | [
"region:us"
] | 2024-01-13T22:04:06+00:00 | {"dataset_info": [{"config_name": "2017-01", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 19895148, "num_examples": 482}], "download_size": 9877238, "dataset_size": 19895148}, {"config_name": "2017-02", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 20111448, "num_examples": 499}], "download_size": 9967413, "dataset_size": 20111448}, {"config_name": "2017-03", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 20815725, "num_examples": 500}], "download_size": 10425653, "dataset_size": 20815725}, {"config_name": "2017-04", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 21575576, "num_examples": 527}], "download_size": 10815992, "dataset_size": 21575576}, {"config_name": "2017-05", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 18573038, "num_examples": 473}], "download_size": 9309268, "dataset_size": 18573038}, {"config_name": "2017-06", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 22890828, "num_examples": 507}], "download_size": 11343584, "dataset_size": 22890828}, {"config_name": "2017-07", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 19960611, "num_examples": 493}], "download_size": 10152091, "dataset_size": 19960611}, {"config_name": "2017-08", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": 
"string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 19273098, "num_examples": 474}], "download_size": 9615408, "dataset_size": 19273098}, {"config_name": "2017-09", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 22552151, "num_examples": 532}], "download_size": 11305139, "dataset_size": 22552151}, {"config_name": "2017-10", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 21441238, "num_examples": 496}], "download_size": 10519666, "dataset_size": 21441238}, {"config_name": "2017-11", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 20655484, "num_examples": 520}], "download_size": 10411397, "dataset_size": 20655484}, {"config_name": "2017-12", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 19708202, "num_examples": 479}], "download_size": 9849435, "dataset_size": 19708202}, {"config_name": "2018-01", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 18090140, "num_examples": 488}], "download_size": 9163072, "dataset_size": 18090140}, {"config_name": "2018-02", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 25638031, "num_examples": 530}], "download_size": 12602449, "dataset_size": 25638031}, {"config_name": "2018-03", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 19922782, "num_examples": 512}], "download_size": 10043038, "dataset_size": 19922782}, {"config_name": "2018-04", "features": [{"name": "entry_id", "dtype": 
"string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 20318335, "num_examples": 499}], "download_size": 10264944, "dataset_size": 20318335}, {"config_name": "2018-05", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 19116513, "num_examples": 493}], "download_size": 9561998, "dataset_size": 19116513}, {"config_name": "2018-06", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 21277471, "num_examples": 511}], "download_size": 10625238, "dataset_size": 21277471}, {"config_name": "2018-07", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 20322860, "num_examples": 517}], "download_size": 10250233, "dataset_size": 20322860}, {"config_name": "2018-08", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 20466912, "num_examples": 504}], "download_size": 10207103, "dataset_size": 20466912}, {"config_name": "2018-09", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 21521957, "num_examples": 516}], "download_size": 10292535, "dataset_size": 21521957}, {"config_name": "2018-10", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 22892365, "num_examples": 532}], "download_size": 11360268, "dataset_size": 22892365}, {"config_name": "2018-11", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": 
"train", "num_bytes": 22750886, "num_examples": 531}], "download_size": 11400549, "dataset_size": 22750886}, {"config_name": "2018-12", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 19157411, "num_examples": 475}], "download_size": 9548624, "dataset_size": 19157411}, {"config_name": "2019-01", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 21024786, "num_examples": 498}], "download_size": 10499015, "dataset_size": 21024786}, {"config_name": "2019-02", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 21517028, "num_examples": 506}], "download_size": 10736779, "dataset_size": 21517028}, {"config_name": "2019-03", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 21397298, "num_examples": 500}], "download_size": 10804690, "dataset_size": 21397298}, {"config_name": "2019-04", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23049654, "num_examples": 535}], "download_size": 11329714, "dataset_size": 23049654}, {"config_name": "2019-05", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 21896838, "num_examples": 522}], "download_size": 10901776, "dataset_size": 21896838}, {"config_name": "2019-06", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 21468690, "num_examples": 528}], "download_size": 10809206, "dataset_size": 21468690}, {"config_name": "2019-07", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", 
"sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 21426189, "num_examples": 545}], "download_size": 10730941, "dataset_size": 21426189}, {"config_name": "2019-08", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 21414686, "num_examples": 532}], "download_size": 10639416, "dataset_size": 21414686}, {"config_name": "2019-09", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 22329624, "num_examples": 538}], "download_size": 11263704, "dataset_size": 22329624}, {"config_name": "2019-10", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 21915199, "num_examples": 520}], "download_size": 10766785, "dataset_size": 21915199}, {"config_name": "2019-11", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 22579122, "num_examples": 547}], "download_size": 11257630, "dataset_size": 22579122}, {"config_name": "2019-12", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 21546668, "num_examples": 514}], "download_size": 10715205, "dataset_size": 21546668}, {"config_name": "2020-01", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 21724474, "num_examples": 507}], "download_size": 10799528, "dataset_size": 21724474}, {"config_name": "2020-02", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23643655, "num_examples": 554}], "download_size": 11764632, "dataset_size": 23643655}, 
{"config_name": "2020-03", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 21444794, "num_examples": 519}], "download_size": 10663961, "dataset_size": 21444794}, {"config_name": "2020-04", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 21944387, "num_examples": 520}], "download_size": 10912679, "dataset_size": 21944387}, {"config_name": "2020-05", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23067240, "num_examples": 553}], "download_size": 11652654, "dataset_size": 23067240}, {"config_name": "2020-06", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23135770, "num_examples": 524}], "download_size": 11385738, "dataset_size": 23135770}, {"config_name": "2020-07", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23826584, "num_examples": 537}], "download_size": 11858237, "dataset_size": 23826584}, {"config_name": "2020-08", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23923168, "num_examples": 547}], "download_size": 12001299, "dataset_size": 23923168}, {"config_name": "2020-09", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23329683, "num_examples": 533}], "download_size": 11503691, "dataset_size": 23329683}, {"config_name": "2020-10", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": 
"string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23027955, "num_examples": 522}], "download_size": 11414934, "dataset_size": 23027955}, {"config_name": "2020-11", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23169835, "num_examples": 523}], "download_size": 11474129, "dataset_size": 23169835}, {"config_name": "2020-12", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 22010579, "num_examples": 510}], "download_size": 10848714, "dataset_size": 22010579}, {"config_name": "2021-01", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 22878979, "num_examples": 518}], "download_size": 11395147, "dataset_size": 22878979}, {"config_name": "2021-02", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 24072264, "num_examples": 509}], "download_size": 11956929, "dataset_size": 24072264}, {"config_name": "2021-03", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 22371344, "num_examples": 520}], "download_size": 11092459, "dataset_size": 22371344}, {"config_name": "2021-04", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 24038552, "num_examples": 534}], "download_size": 11877532, "dataset_size": 24038552}, {"config_name": "2021-05", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 25134668, "num_examples": 531}], "download_size": 12442968, "dataset_size": 25134668}, {"config_name": "2021-06", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": 
"string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23960150, "num_examples": 513}], "download_size": 11925496, "dataset_size": 23960150}, {"config_name": "2021-07", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 26491545, "num_examples": 544}], "download_size": 12969011, "dataset_size": 26491545}, {"config_name": "2021-08", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 22329383, "num_examples": 529}], "download_size": 11170214, "dataset_size": 22329383}, {"config_name": "2021-09", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23242159, "num_examples": 528}], "download_size": 11552932, "dataset_size": 23242159}, {"config_name": "2021-10", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 25042107, "num_examples": 548}], "download_size": 12467001, "dataset_size": 25042107}, {"config_name": "2021-11", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 24102838, "num_examples": 526}], "download_size": 11981239, "dataset_size": 24102838}, {"config_name": "2021-12", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 22876045, "num_examples": 519}], "download_size": 11206046, "dataset_size": 22876045}, {"config_name": "2022-01", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 25170497, "num_examples": 
534}], "download_size": 12517596, "dataset_size": 25170497}, {"config_name": "2022-02", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23898715, "num_examples": 534}], "download_size": 11900408, "dataset_size": 23898715}, {"config_name": "2022-03", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23144005, "num_examples": 527}], "download_size": 11472313, "dataset_size": 23144005}, {"config_name": "2022-04", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23599437, "num_examples": 535}], "download_size": 11617307, "dataset_size": 23599437}, {"config_name": "2022-05", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 27224494, "num_examples": 554}], "download_size": 13511043, "dataset_size": 27224494}, {"config_name": "2022-06", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 77562488, "num_examples": 563}], "download_size": 15038893, "dataset_size": 77562488}, {"config_name": "2022-07", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 25010829, "num_examples": 541}], "download_size": 12486399, "dataset_size": 25010829}, {"config_name": "2022-08", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23609483, "num_examples": 527}], "download_size": 11634375, "dataset_size": 23609483}, {"config_name": "2022-09", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": 
"primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 22995366, "num_examples": 545}], "download_size": 11403016, "dataset_size": 22995366}, {"config_name": "2022-10", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 22475875, "num_examples": 547}], "download_size": 11191644, "dataset_size": 22475875}, {"config_name": "2022-11", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 24869177, "num_examples": 535}], "download_size": 12101593, "dataset_size": 24869177}, {"config_name": "2022-12", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 22974611, "num_examples": 532}], "download_size": 11287343, "dataset_size": 22974611}, {"config_name": "2023-01", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 24450276, "num_examples": 525}], "download_size": 12026946, "dataset_size": 24450276}, {"config_name": "2023-02", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 25158757, "num_examples": 535}], "download_size": 12357634, "dataset_size": 25158757}, {"config_name": "2023-03", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23111243, "num_examples": 550}], "download_size": 11557503, "dataset_size": 23111243}, {"config_name": "2023-04", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 24026318, "num_examples": 550}], "download_size": 11922808, "dataset_size": 24026318}, {"config_name": "2023-05", "features": 
[{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 28626310, "num_examples": 566}], "download_size": 14071637, "dataset_size": 28626310}, {"config_name": "2023-06", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 26152528, "num_examples": 578}], "download_size": 12886392, "dataset_size": 26152528}, {"config_name": "2023-07", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 25268559, "num_examples": 561}], "download_size": 12406681, "dataset_size": 25268559}, {"config_name": "2023-08", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 24995886, "num_examples": 556}], "download_size": 12346514, "dataset_size": 24995886}, {"config_name": "2023-09", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23490122, "num_examples": 527}], "download_size": 11671031, "dataset_size": 23490122}, {"config_name": "2023-10", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 25510139, "num_examples": 538}], "download_size": 12640473, "dataset_size": 25510139}, {"config_name": "2023-11", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23569513, "num_examples": 548}], "download_size": 11720982, "dataset_size": 23569513}, {"config_name": "2023-12", "features": [{"name": "entry_id", "dtype": "string"}, {"name": "published", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "authors", "sequence": "string"}, {"name": "primary_category", "dtype": "string"}, {"name": "categories", "sequence": "string"}, {"name": "text", "dtype": 
"string"}], "splits": [{"name": "train", "num_bytes": 24828076, "num_examples": 544}], "download_size": 12153714, "dataset_size": 24828076}], "configs": [{"config_name": "2017-01", "data_files": [{"split": "train", "path": "2017-01/train-*"}]}, {"config_name": "2017-02", "data_files": [{"split": "train", "path": "2017-02/train-*"}]}, {"config_name": "2017-03", "data_files": [{"split": "train", "path": "2017-03/train-*"}]}, {"config_name": "2017-04", "data_files": [{"split": "train", "path": "2017-04/train-*"}]}, {"config_name": "2017-05", "data_files": [{"split": "train", "path": "2017-05/train-*"}]}, {"config_name": "2017-06", "data_files": [{"split": "train", "path": "2017-06/train-*"}]}, {"config_name": "2017-07", "data_files": [{"split": "train", "path": "2017-07/train-*"}]}, {"config_name": "2017-08", "data_files": [{"split": "train", "path": "2017-08/train-*"}]}, {"config_name": "2017-09", "data_files": [{"split": "train", "path": "2017-09/train-*"}]}, {"config_name": "2017-10", "data_files": [{"split": "train", "path": "2017-10/train-*"}]}, {"config_name": "2017-11", "data_files": [{"split": "train", "path": "2017-11/train-*"}]}, {"config_name": "2017-12", "data_files": [{"split": "train", "path": "2017-12/train-*"}]}, {"config_name": "2018-01", "data_files": [{"split": "train", "path": "2018-01/train-*"}]}, {"config_name": "2018-02", "data_files": [{"split": "train", "path": "2018-02/train-*"}]}, {"config_name": "2018-03", "data_files": [{"split": "train", "path": "2018-03/train-*"}]}, {"config_name": "2018-04", "data_files": [{"split": "train", "path": "2018-04/train-*"}]}, {"config_name": "2018-05", "data_files": [{"split": "train", "path": "2018-05/train-*"}]}, {"config_name": "2018-06", "data_files": [{"split": "train", "path": "2018-06/train-*"}]}, {"config_name": "2018-07", "data_files": [{"split": "train", "path": "2018-07/train-*"}]}, {"config_name": "2018-08", "data_files": [{"split": "train", "path": "2018-08/train-*"}]}, {"config_name": "2018-09", "data_files": [{"split": "train", "path": "2018-09/train-*"}]}, {"config_name": "2018-10", "data_files": [{"split": "train", "path": "2018-10/train-*"}]}, {"config_name": "2018-11", "data_files": [{"split": "train", "path": "2018-11/train-*"}]}, {"config_name": "2018-12", "data_files": [{"split": "train", "path": "2018-12/train-*"}]}, {"config_name": "2019-01", "data_files": [{"split": "train", "path": "2019-01/train-*"}]}, {"config_name": "2019-02", "data_files": [{"split": "train", "path": "2019-02/train-*"}]}, {"config_name": "2019-03", "data_files": [{"split": "train", "path": "2019-03/train-*"}]}, {"config_name": "2019-04", "data_files": [{"split": "train", "path": "2019-04/train-*"}]}, {"config_name": "2019-05", "data_files": [{"split": "train", "path": "2019-05/train-*"}]}, {"config_name": "2019-06", "data_files": [{"split": "train", "path": "2019-06/train-*"}]}, {"config_name": "2019-07", "data_files": [{"split": "train", "path": "2019-07/train-*"}]}, {"config_name": "2019-08", "data_files": [{"split": "train", "path": "2019-08/train-*"}]}, {"config_name": "2019-09", "data_files": [{"split": "train", "path": "2019-09/train-*"}]}, {"config_name": "2019-10", "data_files": [{"split": "train", "path": "2019-10/train-*"}]}, {"config_name": "2019-11", "data_files": [{"split": "train", "path": "2019-11/train-*"}]}, {"config_name": "2019-12", "data_files": [{"split": "train", "path": "2019-12/train-*"}]}, {"config_name": "2020-01", "data_files": [{"split": "train", "path": "2020-01/train-*"}]}, {"config_name": "2020-02", 
"data_files": [{"split": "train", "path": "2020-02/train-*"}]}, {"config_name": "2020-03", "data_files": [{"split": "train", "path": "2020-03/train-*"}]}, {"config_name": "2020-04", "data_files": [{"split": "train", "path": "2020-04/train-*"}]}, {"config_name": "2020-05", "data_files": [{"split": "train", "path": "2020-05/train-*"}]}, {"config_name": "2020-06", "data_files": [{"split": "train", "path": "2020-06/train-*"}]}, {"config_name": "2020-07", "data_files": [{"split": "train", "path": "2020-07/train-*"}]}, {"config_name": "2020-08", "data_files": [{"split": "train", "path": "2020-08/train-*"}]}, {"config_name": "2020-09", "data_files": [{"split": "train", "path": "2020-09/train-*"}]}, {"config_name": "2020-10", "data_files": [{"split": "train", "path": "2020-10/train-*"}]}, {"config_name": "2020-11", "data_files": [{"split": "train", "path": "2020-11/train-*"}]}, {"config_name": "2020-12", "data_files": [{"split": "train", "path": "2020-12/train-*"}]}, {"config_name": "2021-01", "data_files": [{"split": "train", "path": "2021-01/train-*"}]}, {"config_name": "2021-02", "data_files": [{"split": "train", "path": "2021-02/train-*"}]}, {"config_name": "2021-03", "data_files": [{"split": "train", "path": "2021-03/train-*"}]}, {"config_name": "2021-04", "data_files": [{"split": "train", "path": "2021-04/train-*"}]}, {"config_name": "2021-05", "data_files": [{"split": "train", "path": "2021-05/train-*"}]}, {"config_name": "2021-06", "data_files": [{"split": "train", "path": "2021-06/train-*"}]}, {"config_name": "2021-07", "data_files": [{"split": "train", "path": "2021-07/train-*"}]}, {"config_name": "2021-08", "data_files": [{"split": "train", "path": "2021-08/train-*"}]}, {"config_name": "2021-09", "data_files": [{"split": "train", "path": "2021-09/train-*"}]}, {"config_name": "2021-10", "data_files": [{"split": "train", "path": "2021-10/train-*"}]}, {"config_name": "2021-11", "data_files": [{"split": "train", "path": "2021-11/train-*"}]}, {"config_name": "2021-12", "data_files": [{"split": "train", "path": "2021-12/train-*"}]}, {"config_name": "2022-01", "data_files": [{"split": "train", "path": "2022-01/train-*"}]}, {"config_name": "2022-02", "data_files": [{"split": "train", "path": "2022-02/train-*"}]}, {"config_name": "2022-03", "data_files": [{"split": "train", "path": "2022-03/train-*"}]}, {"config_name": "2022-04", "data_files": [{"split": "train", "path": "2022-04/train-*"}]}, {"config_name": "2022-05", "data_files": [{"split": "train", "path": "2022-05/train-*"}]}, {"config_name": "2022-06", "data_files": [{"split": "train", "path": "2022-06/train-*"}]}, {"config_name": "2022-07", "data_files": [{"split": "train", "path": "2022-07/train-*"}]}, {"config_name": "2022-08", "data_files": [{"split": "train", "path": "2022-08/train-*"}]}, {"config_name": "2022-09", "data_files": [{"split": "train", "path": "2022-09/train-*"}]}, {"config_name": "2022-10", "data_files": [{"split": "train", "path": "2022-10/train-*"}]}, {"config_name": "2022-11", "data_files": [{"split": "train", "path": "2022-11/train-*"}]}, {"config_name": "2022-12", "data_files": [{"split": "train", "path": "2022-12/train-*"}]}, {"config_name": "2023-01", "data_files": [{"split": "train", "path": "2023-01/train-*"}]}, {"config_name": "2023-02", "data_files": [{"split": "train", "path": "2023-02/train-*"}]}, {"config_name": "2023-03", "data_files": [{"split": "train", "path": "2023-03/train-*"}]}, {"config_name": "2023-04", "data_files": [{"split": "train", "path": "2023-04/train-*"}]}, {"config_name": "2023-05", 
"data_files": [{"split": "train", "path": "2023-05/train-*"}]}, {"config_name": "2023-06", "data_files": [{"split": "train", "path": "2023-06/train-*"}]}, {"config_name": "2023-07", "data_files": [{"split": "train", "path": "2023-07/train-*"}]}, {"config_name": "2023-08", "data_files": [{"split": "train", "path": "2023-08/train-*"}]}, {"config_name": "2023-09", "data_files": [{"split": "train", "path": "2023-09/train-*"}]}, {"config_name": "2023-10", "data_files": [{"split": "train", "path": "2023-10/train-*"}]}, {"config_name": "2023-11", "data_files": [{"split": "train", "path": "2023-11/train-*"}]}, {"config_name": "2023-12", "data_files": [{"split": "train", "path": "2023-12/train-*"}]}]} | 2024-01-14T05:08:43+00:00 |
|
f83d258c98a04a03ac7a5d5e971f476246e48fc8 |
An augmented and further modified version of the AICG RP logs present in the [Nothing](https://huggingface.co/datasets/noznarb/nothing) archive dataset, provided in Fastchat format and modified in the following ways:
- The first prompt is modified to add context and simple references to aspects of the conversation (OOC, use of emojis, content).
- All conversations were reconstructed into a single seamless conversation, without splits, as much as possible. This is ideal for training long-context models and is the main reason you'd want to use this version of the dataset.
- Repeated conversations that go down different paths were merged, keeping the longest unique path through the conversation tree (a rough sketch of this step is given below).
- Repeated requests by the user to ignore alignment are removed. These are unnecessary if you are fine-tuning an uncensored base model, and they reduce the quality of the training. | grimulkan/aicg-logs-augmented | [
"license:unknown",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T22:13:05+00:00 | {"license": "unknown", "tags": ["not-for-all-audiences"]} | 2024-01-24T00:01:01+00:00 |
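For illustration, here is a minimal Python sketch of the "longest unique path" merge described in the card above. This is an assumption about how such a merge could be implemented, not the actual script used to build the dataset; the `merge_longest_paths` helper and the FastChat-style turn structure (`{"from": ..., "value": ...}`) are illustrative only.

```python
# Minimal sketch (assumed, not the dataset's actual build script) of merging repeated
# conversations by keeping only the longest unique path through the conversation tree.
# Each conversation is assumed to be a list of FastChat-style turns such as
# {"from": "human", "value": "..."} or {"from": "gpt", "value": "..."}.
def merge_longest_paths(conversations):
    def turns(conv):
        return [(turn["from"], turn["value"]) for turn in conv]

    kept = []
    # Longest conversations first, so a shorter conversation that merely repeats the
    # beginning of a longer one is recognised as a prefix and dropped.
    for conv in sorted(conversations, key=len, reverse=True):
        t = turns(conv)
        if not any(turns(other)[: len(t)] == t for other in kept):
            kept.append(conv)
    return kept
```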
a499089b3aae369607cd4110ea4fa4f3e9b3a84d | Kateway/sq | [
"region:us"
] | 2024-01-13T22:16:15+00:00 | {} | 2024-01-23T11:01:36+00:00 |
|
cf49fea0b809cbd62953689d0b49ff8297883123 | bhargavi909/covid_tweets | [
"region:us"
] | 2024-01-13T22:17:02+00:00 | {} | 2024-01-13T22:17:21+00:00 |
|
997ad7ab66153279f758427131b4bfa7f76bcb07 |
# Dataset of murasaki/紫/紫 (Azur Lane)
This is the dataset of murasaki/紫/紫 (Azur Lane), containing 30 images and their tags.
The core tags of this character are `long_hair, purple_hair, breasts, purple_eyes, hair_ribbon, ribbon, large_breasts, very_long_hair, black_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 30 | 66.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/murasaki_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 30 | 28.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/murasaki_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 77 | 63.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/murasaki_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 30 | 52.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/murasaki_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 77 | 98.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/murasaki_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/murasaki_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
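The IMG+TXT packages listed above (for example the 800px variant) are plain zip archives in which each image is expected to come with a same-named `.txt` file holding its tags. Below is a rough loading sketch; the image/`.txt` pairing and the directory walk are assumptions based on the package description, not an official loader.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from PIL import Image

# download and extract the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/murasaki_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its same-named .txt tag file (assumed layout)
for root, _, files in os.walk(dataset_dir):
    for name in sorted(files):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
            continue
        image = Image.open(os.path.join(root, name))
        txt_path = os.path.join(root, stem + '.txt')
        tags = ''
        if os.path.exists(txt_path):
            with open(txt_path, encoding='utf-8') as f:
                tags = f.read().strip()
        print(name, image.size, tags)
```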
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, dress, solo, looking_at_viewer, blush, jewelry, stuffed_animal |
| 1 | 6 |  |  |  |  |  | 1girl, solo, blush, cleavage, looking_at_viewer, navel, bangs, bare_shoulders, bra, collarbone, huge_breasts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | dress | solo | looking_at_viewer | blush | jewelry | stuffed_animal | cleavage | navel | bangs | bare_shoulders | bra | collarbone | huge_breasts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------------------|:--------|:----------|:-----------------|:-----------|:--------|:--------|:-----------------|:------|:-------------|:---------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | | X | X | X | | | X | X | X | X | X | X | X |
| CyberHarem/murasaki_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T22:18:15+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T22:28:51+00:00 |
5ceab1ca1f7505790c0e3295d536680bbb0a827f |
# Dataset of chao_ho/肇和/肇和 (Azur Lane)
This is the dataset of chao_ho/肇和/肇和 (Azur Lane), containing 44 images and their tags.
The core tags of this character are `multicolored_hair, two-tone_hair, white_hair, hair_bun, purple_eyes, split-color_hair, breasts, long_hair, cone_hair_bun, red_hair, double_bun, hairband, large_breasts, bangs, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 44 | 70.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chao_ho_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 44 | 41.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chao_ho_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 108 | 84.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chao_ho_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 44 | 62.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chao_ho_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 108 | 116.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chao_ho_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/chao_ho_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 29 |  |  |  |  |  | bare_shoulders, detached_sleeves, chinese_clothes, cleavage_cutout, looking_at_viewer, wide_sleeves, 1girl, solo, blush, red_dress, single_thighhigh, white_thighhighs |
| 1 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, bare_shoulders, feet, no_shoes, solo, blush, panties_under_pantyhose, thighband_pantyhose, black_pantyhose, fur_trim, toes, branch, purple_hair, soles, white_dress, ass, legs, sitting_in_tree, snow, very_long_hair, wide_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | bare_shoulders | detached_sleeves | chinese_clothes | cleavage_cutout | looking_at_viewer | wide_sleeves | 1girl | solo | blush | red_dress | single_thighhigh | white_thighhighs | feet | no_shoes | panties_under_pantyhose | thighband_pantyhose | black_pantyhose | fur_trim | toes | branch | purple_hair | soles | white_dress | ass | legs | sitting_in_tree | snow | very_long_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------|:-------------------|:------------------|:------------------|:--------------------|:---------------|:--------|:-------|:--------|:------------|:-------------------|:-------------------|:-------|:-----------|:--------------------------|:----------------------|:------------------|:-----------|:-------|:---------|:--------------|:--------|:--------------|:------|:-------|:------------------|:-------|:-----------------|
| 0 | 29 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | | | | X | X | X | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
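To pull out just one of the clusters above (for example the cheongsam-style outfit of cluster #0), a simple approach is to filter the extracted raw package by its characteristic tags. The sketch below assumes `item.meta['tags']` is either a tag-to-score mapping or a plain list of tag strings; adjust it if the actual metadata layout differs.

```python
from waifuc.source import LocalSource

# characteristic tags of cluster #0 above (cheongsam-style outfit)
wanted = {'chinese_clothes', 'red_dress', 'detached_sleeves'}

# 'dataset_dir' is the directory extracted from dataset-raw.zip in the snippet above
for item in LocalSource('dataset_dir'):
    tags = item.meta['tags']
    # assumption: tags is either a {tag: score} mapping or a list of tag strings
    tag_names = set(tags.keys()) if isinstance(tags, dict) else set(tags)
    if wanted.issubset(tag_names):
        print(item.meta['filename'])
```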
| CyberHarem/chao_ho_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T22:18:18+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T22:33:04+00:00 |
a671f1d9a82cc3a28c7301c328a8269baf8cd084 |
# Dataset of west_virginia/ウェストバージニア/西弗吉尼亚 (Azur Lane)
This is the dataset of west_virginia/ウェストバージニア/西弗吉尼亚 (Azur Lane), containing 23 images and their tags.
The core tags of this character are `breasts, long_hair, bangs, black_hair, red_eyes, mole, large_breasts, mole_under_eye, blue_hair, earrings, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 23 | 30.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/west_virginia_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 23 | 17.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/west_virginia_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 50 | 31.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/west_virginia_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 23 | 26.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/west_virginia_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 50 | 47.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/west_virginia_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/west_virginia_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | bare_shoulders, looking_at_viewer, thighs, 1girl, mouth_mask, off_shoulder, solo, thigh_strap, very_long_hair, bare_legs, black_footwear, colored_inner_hair, covered_mouth, dress, full_body, long_legs, long_sleeves, cleavage_cutout, jacket, open_coat, panties |
| 1 | 14 |  |  |  |  |  | bare_shoulders, looking_at_viewer, 1girl, solo, dress, off_shoulder, black_gloves, coat, sleeveless, jewelry, simple_background, white_background, anchor_symbol, long_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | bare_shoulders | looking_at_viewer | thighs | 1girl | mouth_mask | off_shoulder | solo | thigh_strap | very_long_hair | bare_legs | black_footwear | colored_inner_hair | covered_mouth | dress | full_body | long_legs | long_sleeves | cleavage_cutout | jacket | open_coat | panties | black_gloves | coat | sleeveless | jewelry | simple_background | white_background | anchor_symbol |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------|:--------------------|:---------|:--------|:-------------|:---------------|:-------|:--------------|:-----------------|:------------|:-----------------|:---------------------|:----------------|:--------|:------------|:------------|:---------------|:------------------|:---------|:------------|:----------|:---------------|:-------|:-------------|:----------|:--------------------|:-------------------|:----------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | X | | X | | X | X | | | | | | | X | | | X | | | | | X | X | X | X | X | X | X |
| CyberHarem/west_virginia_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T22:18:22+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T22:25:07+00:00 |
0f276a47d174c0d89d409a33e982503dc7430a73 |
# Dataset of mogami/最上/最上 (Azur Lane)
This is the dataset of mogami/最上/最上 (Azur Lane), containing 19 images and their tags.
The core tags of this character are `brown_hair, horns, single_horn, pointy_ears, breasts, red_eyes, long_hair, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 19 | 18.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mogami_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 19 | 13.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mogami_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 39 | 23.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mogami_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 19 | 17.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mogami_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 39 | 28.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mogami_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mogami_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, solo, looking_at_viewer, blush, detached_sleeves, thighhighs, white_background, wide_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | blush | detached_sleeves | thighhighs | white_background | wide_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:-------------------|:-------------|:-------------------|:---------------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X |
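A quick way to see which tags dominate the raw package (similar in spirit to the cluster summary above) is to tally tag frequencies over the extracted files. This sketch reuses the `LocalSource` loader from the snippet above and assumes `item.meta['tags']` is either a tag-to-score mapping or a list of tag strings.

```python
from collections import Counter

from waifuc.source import LocalSource

counter = Counter()
# 'dataset_dir' is the directory extracted from dataset-raw.zip in the snippet above
for item in LocalSource('dataset_dir'):
    tags = item.meta['tags']
    # assumption: tags is either a {tag: score} mapping or a list of tag strings
    counter.update(tags.keys() if isinstance(tags, dict) else tags)

for tag, count in counter.most_common(10):
    print(tag, count)
```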
| CyberHarem/mogami_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T22:18:25+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T22:24:32+00:00 |
59935a9c094b9d3d27a3233584b1905a79e462f5 |
# Dataset of houston/ヒューストン/休斯敦 (Azur Lane)
This is the dataset of houston/ヒューストン/休斯敦 (Azur Lane), containing 16 images and their tags.
The core tags of this character are `green_eyes, pink_hair, long_hair, two_side_up, breasts, ahoge, bangs, small_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 16 | 15.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houston_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 16 | 10.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houston_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 39 | 22.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houston_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 16 | 14.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houston_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 39 | 28.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houston_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/houston_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, looking_at_viewer, blush, bare_shoulders, navel, smile, solo, star_(symbol), open_mouth, collarbone, shorts, black_choker, midriff, criss-cross_halter, red_gloves, simple_background, stomach, thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | blush | bare_shoulders | navel | smile | solo | star_(symbol) | open_mouth | collarbone | shorts | black_choker | midriff | criss-cross_halter | red_gloves | simple_background | stomach | thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-----------------|:--------|:--------|:-------|:----------------|:-------------|:-------------|:---------|:---------------|:----------|:---------------------|:-------------|:--------------------|:----------|:-------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/houston_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T22:18:37+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T22:23:41+00:00 |
663d34a42910e3e84de4692a38bfb711ec451b26 |
# Dataset Card for Evaluation run of SanjiWatsuki/Kunoichi-DPO-v2-7B
Dataset automatically created during the evaluation run of model [SanjiWatsuki/Kunoichi-DPO-v2-7B](https://huggingface.co/SanjiWatsuki/Kunoichi-DPO-v2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SanjiWatsuki__Kunoichi-DPO-v2-7B",
"harness_winogrande_5",
split="train")
```
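The aggregated metrics mentioned above live in the "results" configuration; loading it might look like the following (treating the "train" split as the latest run, as described above):

```python
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_SanjiWatsuki__Kunoichi-DPO-v2-7B",
    "results",
    split="train",
)
print(results[0])  # aggregated metrics for the latest run
```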
## Latest results
These are the [latest results from run 2024-01-18T22:09:51.454026](https://huggingface.co/datasets/open-llm-leaderboard/details_SanjiWatsuki__Kunoichi-DPO-v2-7B/blob/main/results_2024-01-18T22-09-51.454026.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6533767863663313,
"acc_stderr": 0.0320841379180863,
"acc_norm": 0.6540292659740939,
"acc_norm_stderr": 0.03273629792079274,
"mc1": 0.5018359853121175,
"mc1_stderr": 0.017503383046877048,
"mc2": 0.6605635432197811,
"mc2_stderr": 0.015348982161720861
},
"harness|arc:challenge|25": {
"acc": 0.6689419795221843,
"acc_stderr": 0.013752062419817836,
"acc_norm": 0.6962457337883959,
"acc_norm_stderr": 0.01343890918477877
},
"harness|hellaswag|10": {
"acc": 0.7029476199960167,
"acc_stderr": 0.00456025908319737,
"acc_norm": 0.8744274048994224,
"acc_norm_stderr": 0.0033068982422344924
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.035149425512674394,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.035149425512674394
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083515,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083515
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02959732973097809,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02959732973097809
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621112,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621112
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500097,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500097
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4547486033519553,
"acc_stderr": 0.016653875777524,
"acc_norm": 0.4547486033519553,
"acc_norm_stderr": 0.016653875777524
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653349,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653349
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170595,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170595
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.01920660684882536,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.01920660684882536
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.02826388994378459,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.02826388994378459
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578323,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578323
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5018359853121175,
"mc1_stderr": 0.017503383046877048,
"mc2": 0.6605635432197811,
"mc2_stderr": 0.015348982161720861
},
"harness|winogrande|5": {
"acc": 0.8082083662194159,
"acc_stderr": 0.011065209664659527
},
"harness|gsm8k|5": {
"acc": 0.6588324488248674,
"acc_stderr": 0.013059111935831497
}
}
```
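For convenience, here is a minimal loading sketch using the `datasets` library. The configuration name `harness_winogrande_5` and the `latest` split follow the naming convention described above; other tasks are exposed under analogous configuration names.

```python
from datasets import load_dataset

# Per-sample details for one task: each evaluation run is stored as a
# timestamped split, and "latest" always points to the most recent run.
details = load_dataset(
    "open-llm-leaderboard/details_SanjiWatsuki__Kunoichi-DPO-v2-7B",
    "harness_winogrande_5",
    split="latest",
)
print(details)
```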
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
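For readers who want to explore the structure programmatically, a minimal sketch follows (assuming the standard Open LLM Leaderboard layout: one configuration per evaluated task plus an aggregated "results" configuration, with timestamped splits per run and a `latest` split pointing to the most recent one):

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_SanjiWatsuki__Kunoichi-DPO-v2-7B"

# One configuration per evaluated task (plus the aggregated "results" config).
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Each configuration exposes one split per run timestamp, plus "latest".
print(get_dataset_split_names(repo, configs[0]))
```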
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"region:us"
] | 2024-01-13T22:19:02+00:00 | {"pretty_name": "Evaluation run of SanjiWatsuki/Kunoichi-DPO-v2-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [SanjiWatsuki/Kunoichi-DPO-v2-7B](https://huggingface.co/SanjiWatsuki/Kunoichi-DPO-v2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SanjiWatsuki__Kunoichi-DPO-v2-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-18T22:09:51.454026](https://huggingface.co/datasets/open-llm-leaderboard/details_SanjiWatsuki__Kunoichi-DPO-v2-7B/blob/main/results_2024-01-18T22-09-51.454026.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6533767863663313,\n \"acc_stderr\": 0.0320841379180863,\n \"acc_norm\": 0.6540292659740939,\n \"acc_norm_stderr\": 0.03273629792079274,\n \"mc1\": 0.5018359853121175,\n \"mc1_stderr\": 0.017503383046877048,\n \"mc2\": 0.6605635432197811,\n \"mc2_stderr\": 0.015348982161720861\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6689419795221843,\n \"acc_stderr\": 0.013752062419817836,\n \"acc_norm\": 0.6962457337883959,\n \"acc_norm_stderr\": 0.01343890918477877\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7029476199960167,\n \"acc_stderr\": 0.00456025908319737,\n \"acc_norm\": 0.8744274048994224,\n \"acc_norm_stderr\": 0.0033068982422344924\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n 
\"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083515,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083515\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02959732973097809,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02959732973097809\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538271,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538271\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621112,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621112\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n \"acc_stderr\": 0.013387895731543604,\n 
\"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500097,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500097\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4547486033519553,\n \"acc_stderr\": 0.016653875777524,\n \"acc_norm\": 0.4547486033519553,\n \"acc_norm_stderr\": 0.016653875777524\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.012743072942653349,\n \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.012743072942653349\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170595,\n \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170595\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.01920660684882536,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.01920660684882536\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.02826388994378459,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.02826388994378459\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578323,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578323\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5018359853121175,\n \"mc1_stderr\": 0.017503383046877048,\n \"mc2\": 0.6605635432197811,\n \"mc2_stderr\": 0.015348982161720861\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8082083662194159,\n \"acc_stderr\": 0.011065209664659527\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6588324488248674,\n \"acc_stderr\": 0.013059111935831497\n }\n}\n```", "repo_url": "https://huggingface.co/SanjiWatsuki/Kunoichi-DPO-v2-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|arc:challenge|25_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|arc:challenge|25_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|gsm8k|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|gsm8k|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hellaswag|10_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hellaswag|10_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-16-41.700572.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T22-16-41.700572.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T22-09-51.454026.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T22-09-51.454026.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T22-09-51.454026.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-18T22-09-51.454026.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-16-41.700572.parquet"]}, 
{"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["**/details_harness|winogrande|5_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": ["**/details_harness|winogrande|5_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-18T22-09-51.454026.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T22_16_41.700572", "path": ["results_2024-01-13T22-16-41.700572.parquet"]}, {"split": "2024_01_18T22_09_51.454026", "path": 
["results_2024-01-18T22-09-51.454026.parquet"]}, {"split": "latest", "path": ["results_2024-01-18T22-09-51.454026.parquet"]}]}]} | 2024-01-18T22:12:33+00:00 |
e60fbbde63d74523599d9f2ac0684a33b68418f9 |
# Dataset of gsh_18/GSh-18/GSh-18 (Girls' Frontline)
This is the dataset of gsh_18/GSh-18/GSh-18 (Girls' Frontline), containing 31 images and their tags.
The core tags of this character are `black_hair, hair_ornament, red_eyes, ahoge, bangs, hairclip, ponytail, bow, breasts, hat, nurse_cap`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 31 | 16.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gsh_18_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 31 | 13.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gsh_18_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 66 | 25.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gsh_18_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 31 | 16.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gsh_18_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 66 | 29.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gsh_18_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
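Each package in the table above is a plain zip archive hosted on the Hugging Face Hub, so any of them can be fetched the same way as the raw package below. A minimal sketch (the filename `dataset-800.zip` is taken from the table; adjust it for the other packages):

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/gsh_18_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# unpack it into a local directory
target_dir = 'gsh_18_800'
os.makedirs(target_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(target_dir)

# peek at the first few extracted files
print(sorted(os.listdir(target_dir))[:5])
```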
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/gsh_18_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | white_gloves, nurse, 1girl, blush, solo, alternate_costume, apron, side_ponytail, armband, looking_at_viewer, open_mouth |
| 1 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_gloves, blush, hair_ribbon, open_mouth, short_sleeves, white_background, white_pantyhose, bag, black_skirt, pleated_skirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | white_gloves | nurse | 1girl | blush | solo | alternate_costume | apron | side_ponytail | armband | looking_at_viewer | open_mouth | hair_ribbon | short_sleeves | white_background | white_pantyhose | bag | black_skirt | pleated_skirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------|:--------|:--------|:--------|:-------|:--------------------|:--------|:----------------|:----------|:--------------------|:-------------|:--------------|:----------------|:-------------------|:------------------|:------|:--------------|:----------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | | X | X | X | | | | | X | X | X | X | X | X | X | X | X |
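These clusters can also be used programmatically: after loading the raw package with `LocalSource` (see the snippet above), each item exposes its tags through `item.meta['tags']`, which can be checked against one of the clusters. A small sketch, using a few tag names from cluster #0 above:

```python
from waifuc.source import LocalSource

# a few tags taken from cluster #0 in the tables above
wanted = {'nurse', 'white_gloves', 'apron'}

# 'dataset_dir' is the directory extracted by the loading snippet above
for item in LocalSource('dataset_dir'):
    tags = set(item.meta['tags'])  # works whether tags is a list or a dict of tag -> score
    if wanted.issubset(tags):
        print(item.meta['filename'], sorted(wanted))
```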
| CyberHarem/gsh_18_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T22:21:38+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T22:25:59+00:00 |
1f4b2e7a8a74a2d77dd57a2ca43ab47b5bd370ee |
# Dataset of usas_12/USAS-12/USAS-12 (Girls' Frontline)
This is the dataset of usas_12/USAS-12/USAS-12 (Girls' Frontline), containing 23 images and their tags.
The core tags of this character are `long_hair, purple_eyes, hat, bangs, very_long_hair, beret, grey_hair, ribbon, hair_between_eyes, black_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 23 | 33.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/usas_12_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 23 | 20.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/usas_12_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 50 | 39.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/usas_12_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 23 | 29.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/usas_12_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 50 | 54.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/usas_12_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/usas_12_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 23 |  |  |  |  |  | 1girl, looking_at_viewer, fingerless_gloves, necktie, solo, black_gloves, blush, black_jacket, pleated_skirt, black_skirt, gun, short_sleeves, smile, white_thighhighs, closed_mouth, collared_shirt, holding, open_jacket, white_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | fingerless_gloves | necktie | solo | black_gloves | blush | black_jacket | pleated_skirt | black_skirt | gun | short_sleeves | smile | white_thighhighs | closed_mouth | collared_shirt | holding | open_jacket | white_shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------------------|:----------|:-------|:---------------|:--------|:---------------|:----------------|:--------------|:------|:----------------|:--------|:-------------------|:---------------|:-----------------|:----------|:--------------|:--------------|
| 0 | 23 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
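Beyond eyeballing the table, the same metadata can be used to check how often each tag occurs across the raw package once it has been extracted (see the loading snippet above). A small sketch:

```python
from collections import Counter

from waifuc.source import LocalSource

# count tag occurrences over the extracted raw package ('dataset_dir' from the snippet above)
counter = Counter()
for item in LocalSource('dataset_dir'):
    counter.update(set(item.meta['tags']))  # set() works for both list and dict tag metadata

# show the ten most frequent tags
for tag, count in counter.most_common(10):
    print(f'{tag}: {count}')
```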
| CyberHarem/usas_12_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T22:21:39+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T22:26:51+00:00 |
e88e32572b74d072b9f486a40cf1cc1815a28e30 | heitorzim/projetosdevoz | [
"license:openrail",
"region:us"
] | 2024-01-13T22:24:04+00:00 | {"license": "openrail"} | 2024-01-13T22:58:57+00:00 |
|
db6fda4311bd7f3b36acf9e8c4b2255d98c133fc |
# Dataset Card for Evaluation run of chanwit/flux-7b-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [chanwit/flux-7b-v0.1](https://huggingface.co/chanwit/flux-7b-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chanwit__flux-7b-v0.1",
"harness_winogrande_5",
split="train")
```
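The aggregated numbers live in the "results" configuration mentioned above. Assuming it follows the same split naming as the per-task configurations (timestamped splits plus a "latest" split), it could be loaded like this; this is a sketch of the pattern, not output verified against this particular repository:

```python
from datasets import load_dataset

# aggregated metrics of the run; "latest" is assumed to point to the most recent results
results = load_dataset(
    "open-llm-leaderboard/details_chanwit__flux-7b-v0.1",
    "results",
    split="latest",
)
print(results[0])
```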
## Latest results
These are the [latest results from run 2024-01-13T22:25:20.507875](https://huggingface.co/datasets/open-llm-leaderboard/details_chanwit__flux-7b-v0.1/blob/main/results_2024-01-13T22-25-20.507875.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6576135500935923,
"acc_stderr": 0.03184575998004267,
"acc_norm": 0.6577957033968994,
"acc_norm_stderr": 0.03249734268240439,
"mc1": 0.3733170134638923,
"mc1_stderr": 0.01693237055757063,
"mc2": 0.5505210495184077,
"mc2_stderr": 0.015617590489404845
},
"harness|arc:challenge|25": {
"acc": 0.6407849829351536,
"acc_stderr": 0.014020224155839159,
"acc_norm": 0.6706484641638225,
"acc_norm_stderr": 0.013734057652635474
},
"harness|hellaswag|10": {
"acc": 0.6820354511053575,
"acc_stderr": 0.004647338877642188,
"acc_norm": 0.8617805218084047,
"acc_norm_stderr": 0.0034442484997916556
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188712,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188712
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971128,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971128
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291932,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291932
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.0386155754625517,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.0386155754625517
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590167,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590167
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.03192193448934725,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.03192193448934725
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.035865947385739755,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.035865947385739755
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608304,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608304
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3843575418994413,
"acc_stderr": 0.016269088663959402,
"acc_norm": 0.3843575418994413,
"acc_norm_stderr": 0.016269088663959402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292452,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292452
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179604,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179604
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4830508474576271,
"acc_stderr": 0.012762896889210864,
"acc_norm": 0.4830508474576271,
"acc_norm_stderr": 0.012762896889210864
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.027257202606114944,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.027257202606114944
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.01877168389352818,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.01877168389352818
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.024112678240900808,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.024112678240900808
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061463,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061463
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3733170134638923,
"mc1_stderr": 0.01693237055757063,
"mc2": 0.5505210495184077,
"mc2_stderr": 0.015617590489404845
},
"harness|winogrande|5": {
"acc": 0.7900552486187845,
"acc_stderr": 0.01144628062926263
},
"harness|gsm8k|5": {
"acc": 0.7240333586050038,
"acc_stderr": 0.012312603010427352
}
}
```
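Since the block above is plain JSON-style data, simple aggregates can be recomputed from it directly. A small sketch using a handful of the per-task accuracies copied from above (only to illustrate the calculation, not a substitute for the full file):

```python
# a few of the per-task accuracies copied from the results above
subtask_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.29,
    "harness|hendrycksTest-anatomy|5": 0.6,
    "harness|hendrycksTest-astronomy|5": 0.7171052631578947,
    "harness|hendrycksTest-business_ethics|5": 0.63,
}

# unweighted mean over the selected subtasks
mean_acc = sum(subtask_acc.values()) / len(subtask_acc)
print(f"mean acc over {len(subtask_acc)} subtasks: {mean_acc:.4f}")
```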
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_chanwit__flux-7b-v0.1 | [
"region:us"
] | 2024-01-13T22:27:49+00:00 | {"pretty_name": "Evaluation run of chanwit/flux-7b-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [chanwit/flux-7b-v0.1](https://huggingface.co/chanwit/flux-7b-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chanwit__flux-7b-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T22:25:20.507875](https://huggingface.co/datasets/open-llm-leaderboard/details_chanwit__flux-7b-v0.1/blob/main/results_2024-01-13T22-25-20.507875.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6576135500935923,\n \"acc_stderr\": 0.03184575998004267,\n \"acc_norm\": 0.6577957033968994,\n \"acc_norm_stderr\": 0.03249734268240439,\n \"mc1\": 0.3733170134638923,\n \"mc1_stderr\": 0.01693237055757063,\n \"mc2\": 0.5505210495184077,\n \"mc2_stderr\": 0.015617590489404845\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6407849829351536,\n \"acc_stderr\": 0.014020224155839159,\n \"acc_norm\": 0.6706484641638225,\n \"acc_norm_stderr\": 0.013734057652635474\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6820354511053575,\n \"acc_stderr\": 0.004647338877642188,\n \"acc_norm\": 0.8617805218084047,\n \"acc_norm_stderr\": 0.0034442484997916556\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 
0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188712,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188712\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 
0.023854795680971128,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971128\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291932,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291932\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590167,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590167\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.03192193448934725,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.03192193448934725\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.035865947385739755,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.035865947385739755\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n \"acc_stderr\": 0.013428186370608304,\n \"acc_norm\": 0.8301404853128991,\n \"acc_norm_stderr\": 
0.013428186370608304\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3843575418994413,\n \"acc_stderr\": 0.016269088663959402,\n \"acc_norm\": 0.3843575418994413,\n \"acc_norm_stderr\": 0.016269088663959402\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292452,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292452\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.025403832978179604,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.025403832978179604\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4830508474576271,\n \"acc_stderr\": 0.012762896889210864,\n \"acc_norm\": 0.4830508474576271,\n \"acc_norm_stderr\": 0.012762896889210864\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7205882352941176,\n \"acc_stderr\": 0.027257202606114944,\n \"acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.027257202606114944\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.01877168389352818,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.01877168389352818\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n \"acc_stderr\": 0.024112678240900808,\n \"acc_norm\": 0.8656716417910447,\n \"acc_norm_stderr\": 0.024112678240900808\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061463,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061463\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3733170134638923,\n \"mc1_stderr\": 0.01693237055757063,\n \"mc2\": 0.5505210495184077,\n \"mc2_stderr\": 0.015617590489404845\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7900552486187845,\n \"acc_stderr\": 0.01144628062926263\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7240333586050038,\n \"acc_stderr\": 0.012312603010427352\n }\n}\n```", "repo_url": "https://huggingface.co/chanwit/flux-7b-v0.1", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|arc:challenge|25_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|gsm8k|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hellaswag|10_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-25-20.507875.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-25-20.507875.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-25-20.507875.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T22-25-20.507875.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-25-20.507875.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-25-20.507875.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["**/details_harness|winogrande|5_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T22-25-20.507875.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T22_25_20.507875", "path": ["results_2024-01-13T22-25-20.507875.parquet"]}, {"split": "latest", "path": 
["results_2024-01-13T22-25-20.507875.parquet"]}]}]} | 2024-01-13T22:28:09+00:00 |
da2148b2bc45e39ba3ec7a38650907f583bce241 |
# Dataset Card for Evaluation run of Jingyu6/MergeTest-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Jingyu6/MergeTest-7B-slerp](https://huggingface.co/Jingyu6/MergeTest-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Jingyu6__MergeTest-7B-slerp",
"harness_winogrande_5",
	split="latest")
```
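The same pattern works for any of the other task configurations declared for this dataset (for example `harness_gsm8k_5` or `harness_hellaswag_10`). The sketch below is a minimal, illustrative example; it only relies on the `latest` split defined for this repository and does not assume any particular column schema:

```python
from datasets import load_dataset

# Load the per-sample details for one task of this run; "latest" always
# points at the most recent evaluation parquet file for that task.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_Jingyu6__MergeTest-7B-slerp",
    "harness_gsm8k_5",
    split="latest",
)

# Inspect the schema and the first record without assuming field names.
print(gsm8k_details.column_names)
print(gsm8k_details[0])
```

Replacing `"harness_gsm8k_5"` with another configuration name loads the corresponding task's details.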
## Latest results
These are the [latest results from run 2024-01-13T22:27:10.970794](https://huggingface.co/datasets/open-llm-leaderboard/details_Jingyu6__MergeTest-7B-slerp/blob/main/results_2024-01-13T22-27-10.970794.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6435586511149554,
"acc_stderr": 0.03211164826791609,
"acc_norm": 0.6437934877212986,
"acc_norm_stderr": 0.03276783411526557,
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.5979568100280714,
"mc2_stderr": 0.015157800976988994
},
"harness|arc:challenge|25": {
"acc": 0.6484641638225256,
"acc_stderr": 0.013952413699600938,
"acc_norm": 0.6774744027303754,
"acc_norm_stderr": 0.013659980894277366
},
"harness|hellaswag|10": {
"acc": 0.6698864767974507,
"acc_stderr": 0.004692926794268468,
"acc_norm": 0.8614817765385382,
"acc_norm_stderr": 0.003447370972192066
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778408,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778408
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473082,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977927,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977927
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597542,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597542
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323797,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323797
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36201117318435755,
"acc_stderr": 0.016073067350153087,
"acc_norm": 0.36201117318435755,
"acc_norm_stderr": 0.016073067350153087
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.02505850331695814,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.02505850331695814
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.012749206007657473,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.012749206007657473
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.5979568100280714,
"mc2_stderr": 0.015157800976988994
},
"harness|winogrande|5": {
"acc": 0.7963693764798737,
"acc_stderr": 0.011317798781626918
},
"harness|gsm8k|5": {
"acc": 0.6974981046247157,
"acc_stderr": 0.012652544133186141
}
}
```
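To work with these aggregated numbers programmatically instead of copying them from the JSON above, you can also load the `results` configuration of this dataset. The following is a minimal sketch that only assumes the `results` configuration and the `latest` split declared in this card's configuration list; the exact layout of the returned record is not documented here, so it is inspected generically:

```python
from datasets import load_dataset

# The "results" configuration aggregates the metrics of the whole run;
# its "latest" split points at the most recent results parquet file.
results = load_dataset(
    "open-llm-leaderboard/details_Jingyu6__MergeTest-7B-slerp",
    "results",
    split="latest",
)

# Print the available columns and the aggregated record(s).
print(results.column_names)
for row in results:
    print(row)
```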
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Jingyu6__MergeTest-7B-slerp | [
"region:us"
] | 2024-01-13T22:29:29+00:00 | {"pretty_name": "Evaluation run of Jingyu6/MergeTest-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Jingyu6/MergeTest-7B-slerp](https://huggingface.co/Jingyu6/MergeTest-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Jingyu6__MergeTest-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T22:27:10.970794](https://huggingface.co/datasets/open-llm-leaderboard/details_Jingyu6__MergeTest-7B-slerp/blob/main/results_2024-01-13T22-27-10.970794.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6435586511149554,\n \"acc_stderr\": 0.03211164826791609,\n \"acc_norm\": 0.6437934877212986,\n \"acc_norm_stderr\": 0.03276783411526557,\n \"mc1\": 0.42962056303549573,\n \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.5979568100280714,\n \"mc2_stderr\": 0.015157800976988994\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6484641638225256,\n \"acc_stderr\": 0.013952413699600938,\n \"acc_norm\": 0.6774744027303754,\n \"acc_norm_stderr\": 0.013659980894277366\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6698864767974507,\n \"acc_stderr\": 0.004692926794268468,\n \"acc_norm\": 0.8614817765385382,\n \"acc_norm_stderr\": 0.003447370972192066\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n 
\"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778408,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778408\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6487179487179487,\n 
\"acc_stderr\": 0.024203665177902803,\n \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977927,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977927\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597542,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597542\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n \"acc_stderr\": 0.013265346261323797,\n \"acc_norm\": 
0.8352490421455939,\n \"acc_norm_stderr\": 0.013265346261323797\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36201117318435755,\n \"acc_stderr\": 0.016073067350153087,\n \"acc_norm\": 0.36201117318435755,\n \"acc_norm_stderr\": 0.016073067350153087\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695814,\n \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695814\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n \"acc_stderr\": 0.012749206007657473,\n \"acc_norm\": 0.47131681877444587,\n \"acc_norm_stderr\": 0.012749206007657473\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399677,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399677\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42962056303549573,\n \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.5979568100280714,\n \"mc2_stderr\": 0.015157800976988994\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7963693764798737,\n \"acc_stderr\": 0.011317798781626918\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6974981046247157,\n \"acc_stderr\": 0.012652544133186141\n }\n}\n```", "repo_url": 
"https://huggingface.co/Jingyu6/MergeTest-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|arc:challenge|25_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|gsm8k|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hellaswag|10_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-27-10.970794.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-27-10.970794.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-27-10.970794.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T22-27-10.970794.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-27-10.970794.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-27-10.970794.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["**/details_harness|winogrande|5_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T22-27-10.970794.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T22_27_10.970794", "path": ["results_2024-01-13T22-27-10.970794.parquet"]}, {"split": "latest", "path": 
["results_2024-01-13T22-27-10.970794.parquet"]}]}]} | 2024-01-13T22:29:48+00:00 |
9c1da005953cd90d4948effb703ff4bb965a84a3 |
# Dataset Card for Evaluation run of CallComply/openchat-3.5-0106-32k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CallComply/openchat-3.5-0106-32k](https://huggingface.co/CallComply/openchat-3.5-0106-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CallComply__openchat-3.5-0106-32k",
"harness_winogrande_5",
split="train")
```
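As a minimal sketch (assuming network access to the Hugging Face Hub), you can also enumerate the available configurations and load the "latest" split of a single task directly; the config name `harness_gsm8k_5` below is taken from this card's config list:

```python
from datasets import get_dataset_config_names, load_dataset

# List every available config (one per evaluated task, plus "results").
# This queries the Hub, so it assumes network access.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_CallComply__openchat-3.5-0106-32k"
)
print(configs[:5])

# Load the "latest" split of a specific task, e.g. the 5-shot GSM8K details.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_CallComply__openchat-3.5-0106-32k",
    "harness_gsm8k_5",
    split="latest",
)
print(gsm8k_details)
```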
## Latest results
These are the [latest results from run 2024-01-13T22:31:22.930720](https://huggingface.co/datasets/open-llm-leaderboard/details_CallComply__openchat-3.5-0106-32k/blob/main/results_2024-01-13T22-31-22.930720.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6528578653707416,
"acc_stderr": 0.031849870154313474,
"acc_norm": 0.6535559561419437,
"acc_norm_stderr": 0.03250454817189663,
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502022,
"mc2": 0.5189602568049447,
"mc2_stderr": 0.015303685990455876
},
"harness|arc:challenge|25": {
"acc": 0.621160409556314,
"acc_stderr": 0.014175915490000324,
"acc_norm": 0.6604095563139932,
"acc_norm_stderr": 0.01383903976282017
},
"harness|hellaswag|10": {
"acc": 0.6338378809002191,
"acc_stderr": 0.0048076995399734075,
"acc_norm": 0.8293168691495718,
"acc_norm_stderr": 0.0037546293132751625
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.02315787934908353,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.02315787934908353
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579647,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579647
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.030360379710291943,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.030360379710291943
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.013223928616741626,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.013223928616741626
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7601156069364162,
"acc_stderr": 0.022989592543123563,
"acc_norm": 0.7601156069364162,
"acc_norm_stderr": 0.022989592543123563
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.02440439492808787,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.02440439492808787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.023683591837008557,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.023683591837008557
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4869621903520209,
"acc_stderr": 0.012765893883835332,
"acc_norm": 0.4869621903520209,
"acc_norm_stderr": 0.012765893883835332
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02679956202488766,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02679956202488766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399673,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399673
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35862913096695226,
"mc1_stderr": 0.016789289499502022,
"mc2": 0.5189602568049447,
"mc2_stderr": 0.015303685990455876
},
"harness|winogrande|5": {
"acc": 0.8176795580110497,
"acc_stderr": 0.010851565594267195
},
"harness|gsm8k|5": {
"acc": 0.6815769522365428,
"acc_stderr": 0.01283222572307541
}
}
```
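The same aggregated numbers are also stored in the "results" configuration. A minimal sketch for inspecting it is shown below; the exact column layout of that parquet file is not documented on this card, so check `column_names` before relying on specific fields:

```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the column layout may vary
# between leaderboard versions, so inspect it before using specific fields.
results = load_dataset(
    "open-llm-leaderboard/details_CallComply__openchat-3.5-0106-32k",
    "results",
    split="latest",
)
print(results)
print(results.column_names)
```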
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_CallComply__openchat-3.5-0106-32k | [
"region:us"
] | 2024-01-13T22:33:41+00:00 | {"pretty_name": "Evaluation run of CallComply/openchat-3.5-0106-32k", "dataset_summary": "Dataset automatically created during the evaluation run of model [CallComply/openchat-3.5-0106-32k](https://huggingface.co/CallComply/openchat-3.5-0106-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CallComply__openchat-3.5-0106-32k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T22:31:22.930720](https://huggingface.co/datasets/open-llm-leaderboard/details_CallComply__openchat-3.5-0106-32k/blob/main/results_2024-01-13T22-31-22.930720.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6528578653707416,\n \"acc_stderr\": 0.031849870154313474,\n \"acc_norm\": 0.6535559561419437,\n \"acc_norm_stderr\": 0.03250454817189663,\n \"mc1\": 0.35862913096695226,\n \"mc1_stderr\": 0.016789289499502022,\n \"mc2\": 0.5189602568049447,\n \"mc2_stderr\": 0.015303685990455876\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.621160409556314,\n \"acc_stderr\": 0.014175915490000324,\n \"acc_norm\": 0.6604095563139932,\n \"acc_norm_stderr\": 0.01383903976282017\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6338378809002191,\n \"acc_stderr\": 0.0048076995399734075,\n \"acc_norm\": 0.8293168691495718,\n \"acc_norm_stderr\": 0.0037546293132751625\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n 
\"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145634,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145634\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.02315787934908353,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.02315787934908353\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579647,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579647\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n \"acc_stderr\": 0.030360379710291943,\n \"acc_norm\": 0.7130044843049327,\n \"acc_norm_stderr\": 0.030360379710291943\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8365261813537676,\n \"acc_stderr\": 0.013223928616741626,\n \"acc_norm\": 0.8365261813537676,\n \"acc_norm_stderr\": 0.013223928616741626\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7601156069364162,\n \"acc_stderr\": 0.022989592543123563,\n \"acc_norm\": 0.7601156069364162,\n \"acc_norm_stderr\": 0.022989592543123563\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.02440439492808787,\n \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.02440439492808787\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.023683591837008557,\n \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.023683591837008557\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4869621903520209,\n \"acc_stderr\": 0.012765893883835332,\n \"acc_norm\": 0.4869621903520209,\n \"acc_norm_stderr\": 0.012765893883835332\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02679956202488766,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02679956202488766\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399673,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399673\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35862913096695226,\n \"mc1_stderr\": 0.016789289499502022,\n \"mc2\": 0.5189602568049447,\n \"mc2_stderr\": 0.015303685990455876\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8176795580110497,\n \"acc_stderr\": 0.010851565594267195\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6815769522365428,\n \"acc_stderr\": 0.01283222572307541\n 
}\n}\n```", "repo_url": "https://huggingface.co/CallComply/openchat-3.5-0106-32k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|arc:challenge|25_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|gsm8k|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hellaswag|10_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-31-22.930720.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-31-22.930720.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-31-22.930720.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T22-31-22.930720.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-31-22.930720.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T22_31_22.930720", "path": ["**/details_harness|winogrande|5_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T22-31-22.930720.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T22_31_22.930720", "path": ["results_2024-01-13T22-31-22.930720.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T22-31-22.930720.parquet"]}]}]} | 2024-01-13T22:34:00+00:00 |
7f2c358d64dc2fb37184af0dc41cd94a07fd0422 | Kikikmdsa/kvoicemodel | [
"license:openrail",
"region:us"
] | 2024-01-13T22:34:49+00:00 | {"license": "openrail"} | 2024-01-13T22:35:16+00:00 |
|
5156d3bc0ab166e0008d41a55d1928b2bddda476 | MatsuoDochiai/Took | [
"license:openrail",
"region:us"
] | 2024-01-13T22:39:06+00:00 | {"license": "openrail"} | 2024-01-13T22:40:13+00:00 |
|
bd113a6312ec16a730cf8fdfaf54e65ac92ecb26 |
# Dataset of spitfire/Spitfire/喷火 (Girls' Frontline)
This is the dataset of spitfire/Spitfire/喷火 (Girls' Frontline), containing 15 images and their tags.
The core tags of this character are `long_hair, hat, green_eyes, top_hat, grey_hair, breasts, bangs, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 15 | 20.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/spitfire_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 15 | 11.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/spitfire_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 34 | 22.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/spitfire_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 15 | 18.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/spitfire_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 34 | 30.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/spitfire_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/spitfire_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
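The IMG+TXT packages (e.g. `dataset-800.zip` above) are plain archives of images with sidecar `.txt` tag files, so they can also be consumed without waifuc. Below is a minimal sketch assuming the images use common extensions and each caption file shares its image's filename stem; both assumptions are ours, not guaranteed by the package format.
```python
import os
import zipfile
from glob import glob

from huggingface_hub import hf_hub_download

# download one of the IMG+TXT packages (here the 800px variant)
zip_file = hf_hub_download(
    repo_id='CyberHarem/spitfire_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract it into a local directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with the .txt caption sharing its filename stem
IMAGE_EXTS = {'.png', '.jpg', '.jpeg', '.webp'}  # assumed extensions
for image_path in sorted(glob(os.path.join(dataset_dir, '**', '*.*'), recursive=True)):
    stem, ext = os.path.splitext(image_path)
    if ext.lower() not in IMAGE_EXTS:
        continue
    txt_path = stem + '.txt'
    if os.path.exists(txt_path):
        with open(txt_path, 'r', encoding='utf-8') as f:
            tags = f.read().strip()
        print(image_path, tags)
```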
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, solo, black_gloves, dress, belt, handgun, necktie, bare_shoulders, boots, brown_hair, holding_gun, official_alternate_costume, pantyhose, small_breasts, thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | black_gloves | dress | belt | handgun | necktie | bare_shoulders | boots | brown_hair | holding_gun | official_alternate_costume | pantyhose | small_breasts | thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:---------------|:--------|:-------|:----------|:----------|:-----------------|:--------|:-------------|:--------------|:-----------------------------|:------------|:----------------|:-------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/spitfire_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T22:42:35+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T22:45:55+00:00 |
1e2c49579c410741416495c106f07c136e37c738 |
# Dataset of aug_para/AUGPara/AUGSMG (Girls' Frontline)
This is the dataset of aug_para/AUGPara/AUGSMG (Girls' Frontline), containing 19 images and their tags.
The core tags of this character are `long_hair, yellow_eyes, bangs, twintails, grey_hair, bow, hair_ribbon, breasts, hair_bow, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 19 | 28.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aug_para_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 19 | 17.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aug_para_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 46 | 34.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aug_para_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 19 | 25.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aug_para_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 46 | 47.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aug_para_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/aug_para_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
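Once the raw package is extracted as above, the same `LocalSource` can also drive simple tag-based filtering. The sketch below assumes `item.meta['tags']` supports a membership test by tag name (true for both a plain tag list and a tag-to-score mapping); the tag `dress` is just an example taken from the cluster table below.
```python
from waifuc.source import LocalSource

wanted_tag = 'dress'  # example tag; pick any tag from the cluster table

matches = 0
for item in LocalSource('dataset_dir'):
    tags = item.meta['tags']
    # `in` checks the keys of a mapping or the elements of a list, so both layouts work
    if wanted_tag in tags:
        matches += 1
        print(item.meta['filename'])

print(f'{matches} images carry the tag {wanted_tag!r}')
```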
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, solo, blush, looking_at_viewer, dress, open_mouth, smile, holding, simple_background, white_background, bag, black_ribbon, long_sleeves, open_clothes, squirrel, thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blush | looking_at_viewer | dress | open_mouth | smile | holding | simple_background | white_background | bag | black_ribbon | long_sleeves | open_clothes | squirrel | thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:--------|:-------------|:--------|:----------|:--------------------|:-------------------|:------|:---------------|:---------------|:---------------|:-----------|:-------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/aug_para_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T22:42:51+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T22:47:26+00:00 |
ff9bb3cdd1523a60005c828d932407d51b2f79f5 |
# Dataset of shimanto/四万十/四万十 (Azur Lane)
This is the dataset of shimanto/四万十/四万十 (Azur Lane), containing 40 images and their tags.
The core tags of this character are `breasts, long_hair, large_breasts, red_eyes, white_hair, bangs, horns, very_long_hair, dragon_girl`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 40 | 75.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shimanto_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 40 | 35.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shimanto_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 108 | 81.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shimanto_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 40 | 62.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shimanto_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 108 | 123.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shimanto_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shimanto_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
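The cluster tables below are derived from per-image tags like the ones printed above; a rough frequency view can be reproduced by counting tags over the extracted raw package. A minimal sketch, assuming `item.meta['tags']` is either a tag list or a tag-to-score mapping:
```python
from collections import Counter

from waifuc.source import LocalSource

counter = Counter()
for item in LocalSource('dataset_dir'):
    tags = item.meta['tags']
    # use the keys of a mapping, otherwise the list itself
    names = tags.keys() if isinstance(tags, dict) else tags
    counter.update(names)

# print the 20 most frequent tags
for tag, count in counter.most_common(20):
    print(f'{count:4d}  {tag}')
```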
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 28 |  |  |  |  |  | 1girl, solo, cleavage, looking_at_viewer, detached_sleeves, white_thighhighs, bare_shoulders, thighs, sash, blush, navel, panties, tail, white_background, parted_lips, revealing_clothes |
| 1 | 6 |  |  |  |  |  | bare_shoulders, detached_collar, looking_at_viewer, maid_headdress, white_gloves, 1girl, cleavage, detached_sleeves, solo, black_bowtie, smile, wide_sleeves, black_thighhighs, blush, frills, indoors, waist_apron, white_apron |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | cleavage | looking_at_viewer | detached_sleeves | white_thighhighs | bare_shoulders | thighs | sash | blush | navel | panties | tail | white_background | parted_lips | revealing_clothes | detached_collar | maid_headdress | white_gloves | black_bowtie | smile | wide_sleeves | black_thighhighs | frills | indoors | waist_apron | white_apron |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:--------------------|:-------------------|:-------------------|:-----------------|:---------|:-------|:--------|:--------|:----------|:-------|:-------------------|:--------------|:--------------------|:------------------|:-----------------|:---------------|:---------------|:--------|:---------------|:-------------------|:---------|:----------|:--------------|:--------------|
| 0 | 28 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | | X | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/shimanto_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T22:43:56+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T22:54:36+00:00 |
f1717ef6c29c85e677aeaf65fc79a58852354ee6 |
# Dataset of guichen/ギシャン/吉尚 (Azur Lane)
This is the dataset of guichen/ギシャン/吉尚 (Azur Lane), containing 22 images and their tags.
The core tags of this character are `long_hair, breasts, hat, large_breasts, white_headwear, witch_hat, earrings, bangs, purple_eyes, very_long_hair, blue_eyes, mole, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 22 | 38.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guichen_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 22 | 18.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guichen_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 52 | 39.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guichen_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 22 | 32.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guichen_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 52 | 62.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guichen_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/guichen_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 |  |  |  |  |  | 1girl, bare_shoulders, solo, jewelry, looking_at_viewer, detached_sleeves, smile, white_dress, white_thighhighs, black_panties, crescent, thighs, navel, see-through, blush, witch |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | solo | jewelry | looking_at_viewer | detached_sleeves | smile | white_dress | white_thighhighs | black_panties | crescent | thighs | navel | see-through | blush | witch |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------|:----------|:--------------------|:-------------------|:--------|:--------------|:-------------------|:----------------|:-----------|:---------|:--------|:--------------|:--------|:--------|
| 0 | 22 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/guichen_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T22:44:06+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T22:49:32+00:00 |
d0868835f986803bc2e54dccb5d40a457f68810e |
# Dataset of allen_m_sumner/アレン・M・サムナー/艾伦·萨姆纳 (Azur Lane)
This is the dataset of allen_m_sumner/アレン・M・サムナー/艾伦·萨姆纳 (Azur Lane), containing 41 images and their tags.
The core tags of this character are `breasts, long_hair, red_eyes, black_hair, bangs, hair_between_eyes, twintails, hair_ornament, medium_breasts, very_long_hair, bow, large_breasts, animal_ears, blue_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 41 | 67.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/allen_m_sumner_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 41 | 35.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/allen_m_sumner_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 104 | 76.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/allen_m_sumner_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 41 | 57.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/allen_m_sumner_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 104 | 113.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/allen_m_sumner_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/allen_m_sumner_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, bare_shoulders, double_bun, off_shoulder, official_alternate_costume, playboy_bunny, rabbit_ears, solo, black_jacket, black_leotard, long_sleeves, looking_at_viewer, open_jacket, fake_animal_ears, smile, hair_bow, underboob_cutout, braided_bun, brown_pantyhose, sitting, ass, tongue_out, bodystocking, closed_mouth, simple_background, sleeves_past_wrists, black_footwear, blush, shoes, white_background |
| 1 | 18 |  |  |  |  |  | looking_at_viewer, underboob_cutout, 1girl, solo, bare_shoulders, two-tone_leotard, off_shoulder, open_coat, black_leotard, open_mouth, skindentation, black_coat, blush, groin, long_sleeves, thigh_strap, badge, cowboy_shot, frilled_leotard, standing, sidelocks, :d, armpits, ass_visible_through_thighs, white_leotard |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | double_bun | off_shoulder | official_alternate_costume | playboy_bunny | rabbit_ears | solo | black_jacket | black_leotard | long_sleeves | looking_at_viewer | open_jacket | fake_animal_ears | smile | hair_bow | underboob_cutout | braided_bun | brown_pantyhose | sitting | ass | tongue_out | bodystocking | closed_mouth | simple_background | sleeves_past_wrists | black_footwear | blush | shoes | white_background | two-tone_leotard | open_coat | open_mouth | skindentation | black_coat | groin | thigh_strap | badge | cowboy_shot | frilled_leotard | standing | sidelocks | :d | armpits | ass_visible_through_thighs | white_leotard |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------------|:---------------|:-----------------------------|:----------------|:--------------|:-------|:---------------|:----------------|:---------------|:--------------------|:--------------|:-------------------|:--------|:-----------|:-------------------|:--------------|:------------------|:----------|:------|:-------------|:---------------|:---------------|:--------------------|:----------------------|:-----------------|:--------|:--------|:-------------------|:-------------------|:------------|:-------------|:----------------|:-------------|:--------|:--------------|:--------|:--------------|:------------------|:-----------|:------------|:-----|:----------|:-----------------------------|:----------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 18 |  |  |  |  |  | X | X | | X | | | | X | | X | X | X | | | | | X | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/allen_m_sumner_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T22:44:12+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T22:56:02+00:00 |
80e0179ebd1ccca27072d245dcd748c81400774e |
# Dataset of chiyoda/千代田/千代田 (Azur Lane)
This is the dataset of chiyoda/千代田/千代田 (Azur Lane), containing 31 images and their tags.
The core tags of this character are `breasts, red_hair, animal_ears, large_breasts, long_hair, purple_eyes, bangs, fox_ears, animal_ear_fluff, hair_ornament, hair_flower`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 31 | 62.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chiyoda_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 31 | 35.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chiyoda_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 81 | 73.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chiyoda_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 31 | 55.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chiyoda_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 81 | 111.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chiyoda_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/chiyoda_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | looking_at_viewer, 1girl, red_bikini, flower, solo, smile, blush, cleavage, collar, navel, red_eyes, side-tie_bikini_bottom, string_bikini, choker, day, bare_shoulders, hair_between_eyes, open_mouth, outdoors, sky |
| 1 | 9 |  |  |  |  |  | 1girl, fox_mask, looking_at_viewer, solo, wide_sleeves, cleavage, mask_on_head, white_thighhighs, detached_sleeves, armpits, red_skirt, tongue_out, full_body, kimono, sash |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | 1girl | red_bikini | flower | solo | smile | blush | cleavage | collar | navel | red_eyes | side-tie_bikini_bottom | string_bikini | choker | day | bare_shoulders | hair_between_eyes | open_mouth | outdoors | sky | fox_mask | wide_sleeves | mask_on_head | white_thighhighs | detached_sleeves | armpits | red_skirt | tongue_out | full_body | kimono | sash |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:--------|:-------------|:---------|:-------|:--------|:--------|:-----------|:---------|:--------|:-----------|:-------------------------|:----------------|:---------|:------|:-----------------|:--------------------|:-------------|:-----------|:------|:-----------|:---------------|:---------------|:-------------------|:-------------------|:----------|:------------|:-------------|:------------|:---------|:-------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | | | X | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/chiyoda_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T22:44:12+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T22:54:32+00:00 |
86e6d0f7afbdac0c118edd98dc00f8f839b0073c |
# Dataset of an_shan/鞍山/鞍山 (Azur Lane)
This is the dataset of an_shan/鞍山/鞍山 (Azur Lane), containing 25 images and their tags.
The core tags of this character are `green_eyes, long_hair, green_hair, ponytail, hair_ornament, bangs, hairclip, braid, very_long_hair, breasts, hat, horns`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 25 | 28.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/an_shan_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 25 | 17.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/an_shan_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 56 | 35.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/an_shan_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 25 | 25.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/an_shan_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 56 | 49.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/an_shan_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/an_shan_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------|
| 0 | 25 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, fingerless_gloves, epaulettes, black_gloves, long_sleeves, uniform |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | looking_at_viewer | solo | fingerless_gloves | epaulettes | black_gloves | long_sleeves | uniform |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------|:--------------------|:-------------|:---------------|:---------------|:----------|
| 0 | 25 |  |  |  |  |  | X | X | X | X | X | X | X | X | X |
| CyberHarem/an_shan_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T22:44:14+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T22:50:41+00:00 |
48e4b13db97c686c88adef324e3d1b8070212bd5 |
# Dataset of forbin/フォルバン/福尔班 (Azur Lane)
This is the dataset of forbin/フォルバン/福尔班 (Azur Lane), containing 36 images and their tags.
The core tags of this character are `blonde_hair, long_hair, breasts, green_eyes, braid, large_breasts, bangs, bow, hair_ornament, hair_bun, ribbon, single_hair_bun`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 36 | 45.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/forbin_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 36 | 27.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/forbin_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 84 | 55.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/forbin_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 36 | 40.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/forbin_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 84 | 76.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/forbin_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/forbin_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 23 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, white_dress, bare_shoulders, cleavage, white_gloves, collarbone, elbow_gloves, fingerless_gloves, hair_bow, holding, open_mouth, white_bow, hair_between_eyes, parted_lips |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | looking_at_viewer | solo | white_dress | bare_shoulders | cleavage | white_gloves | collarbone | elbow_gloves | fingerless_gloves | hair_bow | holding | open_mouth | white_bow | hair_between_eyes | parted_lips |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------|:--------------|:-----------------|:-----------|:---------------|:-------------|:---------------|:--------------------|:-----------|:----------|:-------------|:------------|:--------------------|:--------------|
| 0 | 23 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/forbin_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T22:44:16+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T22:53:13+00:00 |
b7af18f9954d53404eb6a75c2298202d077d9bc7 | alphalm/gt1_8kElo_all_tokenized | [
"license:apache-2.0",
"region:us"
] | 2024-01-13T22:44:50+00:00 | {"license": "apache-2.0"} | 2024-01-14T01:33:47+00:00 |
|
eea8eb53c93d7bb3e49855cdf27007dbbb4b337a | bhargavi909/medical | [
"region:us"
] | 2024-01-13T22:47:59+00:00 | {} | 2024-01-13T22:48:16+00:00 |
|
e98bd565f11ee405be251d09d536f9578a4c9252 |
This is a dataset of translation variants generated for `load_dataset("facebook/flores", "eng_Latn-ukr_Cyrl")["dev"]` using [mistralai/Mistral-7B-v0.1](https://docs.mistral.ai/self-deployment/vllm/).
Data was generated using the following script:
```python
import sys
import requests
import json
context = """[INST] They are planning to host a party next weekend. [/INST] Вони планують провести вечірку наступного вікенду.
[INST] I enjoy swimming in the ocean and feeling the salty breeze. [/INST] Мені подобається плавати в океані та відчувати солоний вітер.
[INST]"""
def prompt(input, url="http://localhost:8000/v1/completions"):
    # Request beam-search candidates for one English sentence, using the
    # two-shot translation context defined above as the prompt prefix.
    data = {
        "prompt": f"{context} {input} [/INST]",
        "stop": "[INST]",
        "max_tokens": 512,
        "temperature": 0,
        #"temperature": 1.0,
        #"top_p": 0.001,
        #"top_k": 40,
        "model": "mistralai/Mistral-7B-v0.1",
        "presence_penalty": 0.1,
        "use_beam_search": True,
        "n": 25,  # 25 beam candidates per sentence
        "logprobs": 1,
    }
    headers = {
        "Content-Type": "application/json"
    }
    response = requests.post(url, headers=headers, data=json.dumps(data))
    result = response.json()
    return result
# Read one English source sentence per stdin line; print one JSON response per line.
for line in sys.stdin:
    result = prompt(line.strip())
    print(json.dumps(result, ensure_ascii=False))
```
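The script emits one JSON object per source sentence on stdout. As a rough sketch (not part of the original card, and assuming the vLLM OpenAI-compatible completion response schema with a `choices` list of candidates; `beams.jsonl` is a hypothetical name for the captured output), the beam candidates can be read back like this:
```python
import json

# read one vLLM completion response per line (hypothetical capture of the script's stdout)
with open("beams.jsonl", encoding="utf-8") as f:
    for line in f:
        result = json.loads(line)
        # each choice is one beam-search candidate translation of the source sentence
        candidates = [choice["text"].strip() for choice in result.get("choices", [])]
        print(candidates[:3])  # inspect the top few beams
```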
To quickly run vLLM locally, use:
```
docker run --gpus all -p 8000:8000 -e HF_HOME=/hf -e CUDA_VISIBLE_DEVICES=0 -v ~/.cache/huggingface:/hf \
ghcr.io/mistralai/mistral-src/vllm:latest --host 0.0.0.0 --model mistralai/Mistral-7B-v0.1
``` | darkproger/flores-uk-beams | [
"task_categories:translation",
"size_categories:n<1K",
"language:uk",
"language:en",
"license:mit",
"region:us"
] | 2024-01-13T22:48:23+00:00 | {"language": ["uk", "en"], "license": "mit", "size_categories": ["n<1K"], "task_categories": ["translation"]} | 2024-01-13T23:51:31+00:00 |
228a2cbad7e7d8961f98e258eef90686a778628d |
# Dataset Card for Evaluation run of SanjiWatsuki/Lelantos-DPO-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SanjiWatsuki/Lelantos-DPO-7B](https://huggingface.co/SanjiWatsuki/Lelantos-DPO-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SanjiWatsuki__Lelantos-DPO-7B",
"harness_winogrande_5",
split="train")
```
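The aggregated metrics live in the "results" configuration mentioned above; as a sketch, they can be loaded the same way:
```python
from datasets import load_dataset

# aggregated run-level results (the "results" configuration described above)
results = load_dataset("open-llm-leaderboard/details_SanjiWatsuki__Lelantos-DPO-7B",
	"results",
	split="train")
```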
## Latest results
These are the [latest results from run 2024-01-13T22:46:02.001551](https://huggingface.co/datasets/open-llm-leaderboard/details_SanjiWatsuki__Lelantos-DPO-7B/blob/main/results_2024-01-13T22-46-02.001551.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6448646713155067,
"acc_stderr": 0.03228864323452271,
"acc_norm": 0.6451783471312221,
"acc_norm_stderr": 0.03294673289821137,
"mc1": 0.5067319461444308,
"mc1_stderr": 0.017501914492655396,
"mc2": 0.6777342992399603,
"mc2_stderr": 0.01515297850307826
},
"harness|arc:challenge|25": {
"acc": 0.6732081911262798,
"acc_stderr": 0.013706665975587333,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.013250012579393443
},
"harness|hellaswag|10": {
"acc": 0.6960764787890859,
"acc_stderr": 0.004590100050198808,
"acc_norm": 0.8722366062537343,
"acc_norm_stderr": 0.0033314391934060423
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880274,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.543859649122807,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.543859649122807,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.02354079935872329,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.02354079935872329
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.02446861524147892,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.02446861524147892
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969115,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969115
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.015990154885073403,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.015990154885073403
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4011173184357542,
"acc_stderr": 0.01639222189940707,
"acc_norm": 0.4011173184357542,
"acc_norm_stderr": 0.01639222189940707
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4784876140808344,
"acc_stderr": 0.012758410941038911,
"acc_norm": 0.4784876140808344,
"acc_norm_stderr": 0.012758410941038911
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.027678468642144717,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.027678468642144717
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507208,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507208
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801301,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5067319461444308,
"mc1_stderr": 0.017501914492655396,
"mc2": 0.6777342992399603,
"mc2_stderr": 0.01515297850307826
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.01123532838262585
},
"harness|gsm8k|5": {
"acc": 0.6846095526914329,
"acc_stderr": 0.01279935367580183
}
}
```
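As a small illustrative sketch (not from the card; it assumes the JSON above has been saved locally under the hypothetical name `results.json`), the MMLU (hendrycksTest) subtask accuracies can be averaged from the parsed results:
```python
import json

# hypothetical local copy of the per-task results shown above
with open("results.json", encoding="utf-8") as f:
    results = json.load(f)

# mean accuracy over the hendrycksTest (MMLU) subtasks
mmlu_accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
print(sum(mmlu_accs) / len(mmlu_accs))
```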
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_SanjiWatsuki__Lelantos-DPO-7B | [
"region:us"
] | 2024-01-13T22:48:23+00:00 | {"pretty_name": "Evaluation run of SanjiWatsuki/Lelantos-DPO-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [SanjiWatsuki/Lelantos-DPO-7B](https://huggingface.co/SanjiWatsuki/Lelantos-DPO-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SanjiWatsuki__Lelantos-DPO-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T22:46:02.001551](https://huggingface.co/datasets/open-llm-leaderboard/details_SanjiWatsuki__Lelantos-DPO-7B/blob/main/results_2024-01-13T22-46-02.001551.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6448646713155067,\n \"acc_stderr\": 0.03228864323452271,\n \"acc_norm\": 0.6451783471312221,\n \"acc_norm_stderr\": 0.03294673289821137,\n \"mc1\": 0.5067319461444308,\n \"mc1_stderr\": 0.017501914492655396,\n \"mc2\": 0.6777342992399603,\n \"mc2_stderr\": 0.01515297850307826\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6732081911262798,\n \"acc_stderr\": 0.013706665975587333,\n \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393443\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6960764787890859,\n \"acc_stderr\": 0.004590100050198808,\n \"acc_norm\": 0.8722366062537343,\n \"acc_norm_stderr\": 0.0033314391934060423\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880274,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880274\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 
0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.543859649122807,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.543859649122807,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.02354079935872329,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.02354079935872329\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6307692307692307,\n \"acc_stderr\": 0.02446861524147892,\n \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.02446861524147892\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969115,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969115\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.015990154885073403,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.015990154885073403\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n 
\"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4011173184357542,\n \"acc_stderr\": 0.01639222189940707,\n \"acc_norm\": 0.4011173184357542,\n \"acc_norm_stderr\": 0.01639222189940707\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4784876140808344,\n \"acc_stderr\": 0.012758410941038911,\n \"acc_norm\": 0.4784876140808344,\n \"acc_norm_stderr\": 0.012758410941038911\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.027678468642144717,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.027678468642144717\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.02768691358801301,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.02768691358801301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5067319461444308,\n \"mc1_stderr\": 0.017501914492655396,\n \"mc2\": 0.6777342992399603,\n \"mc2_stderr\": 0.01515297850307826\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.01123532838262585\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6846095526914329,\n \"acc_stderr\": 0.01279935367580183\n }\n}\n```", "repo_url": "https://huggingface.co/SanjiWatsuki/Lelantos-DPO-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|arc:challenge|25_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|gsm8k|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hellaswag|10_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-46-02.001551.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-46-02.001551.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-46-02.001551.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T22-46-02.001551.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-46-02.001551.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T22-46-02.001551.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["**/details_harness|winogrande|5_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T22-46-02.001551.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T22_46_02.001551", "path": ["results_2024-01-13T22-46-02.001551.parquet"]}, {"split": "latest", "path": 
["results_2024-01-13T22-46-02.001551.parquet"]}]}]} | 2024-01-13T22:48:44+00:00 |
df37a0b1c38609f0fa7d96fadc93e6280617e145 | bhargavi909/covid19 | [
"region:us"
] | 2024-01-13T22:53:02+00:00 | {} | 2024-01-13T22:53:35+00:00 |
|
d4cb2f4ea65fe35a8c1b91569e02c5463ebc9332 | kslice/embtut | [
"region:us"
] | 2024-01-13T22:54:41+00:00 | {} | 2024-01-13T22:56:20+00:00 |
|
dc1c98271d35a48fcf490888548fe18588672f0e | bhargavi909/covid_final | [
"region:us"
] | 2024-01-13T22:59:41+00:00 | {} | 2024-01-13T23:00:03+00:00 |
|
f6196610392e1bddf349ddf64f391374be370708 | UnderstandLing/oasst1_bn | [
"license:apache-2.0",
"region:us"
] | 2024-01-13T22:59:59+00:00 | {"license": "apache-2.0", "dataset_info": {"features": [{"name": "message_id", "dtype": "string"}, {"name": "parent_id", "dtype": "string"}, {"name": "user_id", "dtype": "string"}, {"name": "created_date", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "role", "dtype": "string"}, {"name": "lang", "dtype": "string"}, {"name": "review_count", "dtype": "int64"}, {"name": "review_result", "dtype": "bool"}, {"name": "deleted", "dtype": "bool"}, {"name": "rank", "dtype": "float64"}, {"name": "synthetic", "dtype": "bool"}, {"name": "model_name", "dtype": "null"}, {"name": "detoxify", "struct": [{"name": "identity_attack", "dtype": "float64"}, {"name": "insult", "dtype": "float64"}, {"name": "obscene", "dtype": "float64"}, {"name": "severe_toxicity", "dtype": "float64"}, {"name": "sexual_explicit", "dtype": "float64"}, {"name": "threat", "dtype": "float64"}, {"name": "toxicity", "dtype": "float64"}]}, {"name": "message_tree_id", "dtype": "string"}, {"name": "tree_state", "dtype": "string"}, {"name": "emojis", "struct": [{"name": "count", "sequence": "int64"}, {"name": "name", "sequence": "string"}]}, {"name": "labels", "struct": [{"name": "count", "sequence": "int64"}, {"name": "name", "sequence": "string"}, {"name": "value", "sequence": "float64"}]}], "splits": [{"name": "train", "num_bytes": 117761745, "num_examples": 83078}, {"name": "validation", "num_bytes": 5686930, "num_examples": 3946}], "download_size": 31092337, "dataset_size": 123448675}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-13T23:00:48+00:00 |
|
9cf0db6fcbc68d7535b4e461a95f969f10777f55 |
# Dataset Card for Evaluation run of vicgalle/SOLAR-13B-Instruct-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vicgalle/SOLAR-13B-Instruct-v1.0](https://huggingface.co/vicgalle/SOLAR-13B-Instruct-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vicgalle__SOLAR-13B-Instruct-v1.0",
"harness_winogrande_5",
split="train")
```
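As a minimal sketch (assuming the `datasets` library is installed and the split names match the configurations listed for this repository), the aggregated "results" configuration can be pulled in the same way:
```python
from datasets import load_dataset

# Aggregated metrics for the run; the "latest" split points at the most recent results parquet.
results = load_dataset(
    "open-llm-leaderboard/details_vicgalle__SOLAR-13B-Instruct-v1.0",
    "results",
    split="latest",
)

# Inspect what the aggregated table contains before relying on specific column names.
print(results.column_names)
print(results[0])
```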
## Latest results
These are the [latest results from run 2024-01-13T23:03:16.622437](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__SOLAR-13B-Instruct-v1.0/blob/main/results_2024-01-13T23-03-16.622437.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5538159165724174,
"acc_stderr": 0.03403197325352318,
"acc_norm": 0.5615645038041155,
"acc_norm_stderr": 0.03477929396757003,
"mc1": 0.44920440636474906,
"mc1_stderr": 0.01741294198611531,
"mc2": 0.619920564120794,
"mc2_stderr": 0.01593484036504592
},
"harness|arc:challenge|25": {
"acc": 0.5435153583617748,
"acc_stderr": 0.01455594976049644,
"acc_norm": 0.5725255972696246,
"acc_norm_stderr": 0.014456862944650647
},
"harness|hellaswag|10": {
"acc": 0.5913164708225453,
"acc_stderr": 0.004905859114942291,
"acc_norm": 0.7803226448914559,
"acc_norm_stderr": 0.004131818797713876
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791194,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791194
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111502,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111502
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.03772446857518026,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.03772446857518026
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4808510638297872,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.4808510638297872,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502706986,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502706986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36243386243386244,
"acc_stderr": 0.024757473902752042,
"acc_norm": 0.36243386243386244,
"acc_norm_stderr": 0.024757473902752042
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6387096774193548,
"acc_stderr": 0.027327548447957546,
"acc_norm": 0.6387096774193548,
"acc_norm_stderr": 0.027327548447957546
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.034653044884067945,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.034653044884067945
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.03332299921070644,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.03332299921070644
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.03027690994517826,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.03027690994517826
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5025641025641026,
"acc_stderr": 0.025350672979412188,
"acc_norm": 0.5025641025641026,
"acc_norm_stderr": 0.025350672979412188
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03242225027115006,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03242225027115006
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7504587155963303,
"acc_stderr": 0.01855389762950163,
"acc_norm": 0.7504587155963303,
"acc_norm_stderr": 0.01855389762950163
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643524,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643524
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7303921568627451,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.7303921568627451,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.03252113489929188,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.03252113489929188
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009225,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009225
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.04732332615978813,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.04732332615978813
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6134969325153374,
"acc_stderr": 0.03825825548848607,
"acc_norm": 0.6134969325153374,
"acc_norm_stderr": 0.03825825548848607
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.027236013946196697,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.027236013946196697
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7484035759897829,
"acc_stderr": 0.01551732236552963,
"acc_norm": 0.7484035759897829,
"acc_norm_stderr": 0.01551732236552963
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.026483392042098174,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.026483392042098174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2782122905027933,
"acc_stderr": 0.014987325439963551,
"acc_norm": 0.2782122905027933,
"acc_norm_stderr": 0.014987325439963551
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.028213504177824103,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.028213504177824103
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6109324758842444,
"acc_stderr": 0.027690337536485372,
"acc_norm": 0.6109324758842444,
"acc_norm_stderr": 0.027690337536485372
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6635802469135802,
"acc_stderr": 0.026289734945952922,
"acc_norm": 0.6635802469135802,
"acc_norm_stderr": 0.026289734945952922
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41843971631205673,
"acc_stderr": 0.029427994039419998,
"acc_norm": 0.41843971631205673,
"acc_norm_stderr": 0.029427994039419998
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4172099087353325,
"acc_stderr": 0.012593959992906429,
"acc_norm": 0.4172099087353325,
"acc_norm_stderr": 0.012593959992906429
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5808823529411765,
"acc_stderr": 0.029972807170464622,
"acc_norm": 0.5808823529411765,
"acc_norm_stderr": 0.029972807170464622
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5424836601307189,
"acc_stderr": 0.02015468571259089,
"acc_norm": 0.5424836601307189,
"acc_norm_stderr": 0.02015468571259089
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5102040816326531,
"acc_stderr": 0.03200255347893783,
"acc_norm": 0.5102040816326531,
"acc_norm_stderr": 0.03200255347893783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7114427860696517,
"acc_stderr": 0.03203841040213321,
"acc_norm": 0.7114427860696517,
"acc_norm_stderr": 0.03203841040213321
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7426900584795322,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.7426900584795322,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44920440636474906,
"mc1_stderr": 0.01741294198611531,
"mc2": 0.619920564120794,
"mc2_stderr": 0.01593484036504592
},
"harness|winogrande|5": {
"acc": 0.7024467245461721,
"acc_stderr": 0.012849085254614654
},
"harness|gsm8k|5": {
"acc": 0.16603487490523122,
"acc_stderr": 0.01024981199059352
}
}
```
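If you prefer the raw JSON file linked above rather than the parquet export, a minimal sketch (assuming `huggingface_hub` is installed; the file name is taken from the link at the top of this section) is:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the results JSON referenced above directly from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_vicgalle__SOLAR-13B-Instruct-v1.0",
    repo_type="dataset",
    filename="results_2024-01-13T23-03-16.622437.json",
)

with open(path) as f:
    raw_results = json.load(f)

# Inspect the top-level keys before assuming a particular nesting.
print(list(raw_results.keys()))
```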
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_vicgalle__SOLAR-13B-Instruct-v1.0 | [
"region:us"
] | 2024-01-13T23:05:33+00:00 | {"pretty_name": "Evaluation run of vicgalle/SOLAR-13B-Instruct-v1.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [vicgalle/SOLAR-13B-Instruct-v1.0](https://huggingface.co/vicgalle/SOLAR-13B-Instruct-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vicgalle__SOLAR-13B-Instruct-v1.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T23:03:16.622437](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__SOLAR-13B-Instruct-v1.0/blob/main/results_2024-01-13T23-03-16.622437.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5538159165724174,\n \"acc_stderr\": 0.03403197325352318,\n \"acc_norm\": 0.5615645038041155,\n \"acc_norm_stderr\": 0.03477929396757003,\n \"mc1\": 0.44920440636474906,\n \"mc1_stderr\": 0.01741294198611531,\n \"mc2\": 0.619920564120794,\n \"mc2_stderr\": 0.01593484036504592\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5435153583617748,\n \"acc_stderr\": 0.01455594976049644,\n \"acc_norm\": 0.5725255972696246,\n \"acc_norm_stderr\": 0.014456862944650647\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5913164708225453,\n \"acc_stderr\": 0.004905859114942291,\n \"acc_norm\": 0.7803226448914559,\n \"acc_norm_stderr\": 0.004131818797713876\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791194,\n \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791194\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n \"acc_stderr\": 0.04140685639111502,\n \"acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.04140685639111502\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n \"acc_stderr\": 0.03772446857518026,\n \"acc_norm\": 0.5722543352601156,\n \"acc_norm_stderr\": 0.03772446857518026\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4808510638297872,\n \"acc_stderr\": 0.03266204299064678,\n \"acc_norm\": 0.4808510638297872,\n \"acc_norm_stderr\": 0.03266204299064678\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n \"acc_stderr\": 0.044895393502706986,\n \"acc_norm\": 0.3508771929824561,\n \"acc_norm_stderr\": 0.044895393502706986\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36243386243386244,\n \"acc_stderr\": 0.024757473902752042,\n \"acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.024757473902752042\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6387096774193548,\n \"acc_stderr\": 0.027327548447957546,\n \"acc_norm\": 0.6387096774193548,\n \"acc_norm_stderr\": 0.027327548447957546\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.034653044884067945,\n \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.034653044884067945\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6767676767676768,\n \"acc_stderr\": 0.03332299921070644,\n \"acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.03332299921070644\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.03027690994517826,\n \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.03027690994517826\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5025641025641026,\n \"acc_stderr\": 0.025350672979412188,\n \"acc_norm\": 0.5025641025641026,\n \"acc_norm_stderr\": 0.025350672979412188\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03242225027115006,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03242225027115006\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7504587155963303,\n \"acc_stderr\": 0.01855389762950163,\n \"acc_norm\": 0.7504587155963303,\n \"acc_norm_stderr\": 0.01855389762950163\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643524,\n \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643524\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7303921568627451,\n \"acc_stderr\": 0.031145570659486782,\n \"acc_norm\": 0.7303921568627451,\n \"acc_norm_stderr\": 0.031145570659486782\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n \"acc_stderr\": 0.03252113489929188,\n \"acc_norm\": 0.6233183856502242,\n \"acc_norm_stderr\": 0.03252113489929188\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009225,\n \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009225\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\": 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.04732332615978813,\n \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.04732332615978813\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6134969325153374,\n \"acc_stderr\": 0.03825825548848607,\n \"acc_norm\": 0.6134969325153374,\n \"acc_norm_stderr\": 0.03825825548848607\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.027236013946196697,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.027236013946196697\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7484035759897829,\n \"acc_stderr\": 
0.01551732236552963,\n \"acc_norm\": 0.7484035759897829,\n \"acc_norm_stderr\": 0.01551732236552963\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.026483392042098174,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.026483392042098174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2782122905027933,\n \"acc_stderr\": 0.014987325439963551,\n \"acc_norm\": 0.2782122905027933,\n \"acc_norm_stderr\": 0.014987325439963551\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.028213504177824103,\n \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.028213504177824103\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6109324758842444,\n \"acc_stderr\": 0.027690337536485372,\n \"acc_norm\": 0.6109324758842444,\n \"acc_norm_stderr\": 0.027690337536485372\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6635802469135802,\n \"acc_stderr\": 0.026289734945952922,\n \"acc_norm\": 0.6635802469135802,\n \"acc_norm_stderr\": 0.026289734945952922\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.41843971631205673,\n \"acc_stderr\": 0.029427994039419998,\n \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.029427994039419998\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4172099087353325,\n \"acc_stderr\": 0.012593959992906429,\n \"acc_norm\": 0.4172099087353325,\n \"acc_norm_stderr\": 0.012593959992906429\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5808823529411765,\n \"acc_stderr\": 0.029972807170464622,\n \"acc_norm\": 0.5808823529411765,\n \"acc_norm_stderr\": 0.029972807170464622\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5424836601307189,\n \"acc_stderr\": 0.02015468571259089,\n \"acc_norm\": 0.5424836601307189,\n \"acc_norm_stderr\": 0.02015468571259089\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5102040816326531,\n \"acc_stderr\": 0.03200255347893783,\n \"acc_norm\": 0.5102040816326531,\n \"acc_norm_stderr\": 0.03200255347893783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n \"acc_stderr\": 0.03203841040213321,\n \"acc_norm\": 0.7114427860696517,\n \"acc_norm_stderr\": 0.03203841040213321\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44920440636474906,\n \"mc1_stderr\": 0.01741294198611531,\n \"mc2\": 0.619920564120794,\n \"mc2_stderr\": 0.01593484036504592\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7024467245461721,\n \"acc_stderr\": 0.012849085254614654\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.16603487490523122,\n \"acc_stderr\": 0.01024981199059352\n }\n}\n```", "repo_url": 
"https://huggingface.co/vicgalle/SOLAR-13B-Instruct-v1.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|arc:challenge|25_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|gsm8k|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hellaswag|10_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-03-16.622437.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-03-16.622437.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-03-16.622437.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T23-03-16.622437.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-03-16.622437.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-03-16.622437.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["**/details_harness|winogrande|5_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T23-03-16.622437.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T23_03_16.622437", "path": ["results_2024-01-13T23-03-16.622437.parquet"]}, {"split": "latest", "path": 
["results_2024-01-13T23-03-16.622437.parquet"]}]}]} | 2024-01-13T23:05:54+00:00 |
9a06e685c89286eb2bb8026411d150532f86ccb3 | # IndirectRequests
IndirectRequests is an LLM-generated dataset of user utterances in a task-oriented dialogue setting where the user does not directly specify their preferred slot value.
IndirectRequests was built by crowdsourcing human labels over utterances generated with a combination of GPT-3.5 (turbo) and GPT-4.
Each utterance is labelled along two dimensions:
1. World Understanding (the degree of world understanding required to interpret the utterance)
2. Unambiguity (whether the generated utterance unambiguously entails a single target slot value among a set of candidate values). A loading sketch is shown below.
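Below is a minimal loading sketch, assuming the Hugging Face `datasets` library and the configs and splits declared in the YAML front matter further down; the exact per-example field names are not documented here, so inspect the features before relying on them.

```python
from datasets import load_dataset

# "target_slot_value" is one of the two configs listed in this card;
# the other is "mean_world_understanding". Each ships train/validation/test splits.
dataset = load_dataset("msamogh/indirect-requests", "target_slot_value")

train = dataset["train"]
print(train)      # shows the features and number of rows
print(train[0])   # inspect a single example before assuming field names
```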
---
license: mit
size_categories:
- n<1K
task_categories:
- text-classification
- conversational
- text-generation
pretty_name: IndirectRequests
configs:
- config_name: target_slot_value
data_files:
- split: train
path: data/train_target_slot_value.jsonl
- split: validation
path: data/validation_target_slot_value.jsonl
- split: test
path: data/test_target_slot_value.jsonl
- config_name: mean_world_understanding
data_files:
- split: train
path: data/train_mean_world_understanding.jsonl
- split: validation
path: data/validation_mean_world_understanding.jsonl
- split: test
path: data/test_mean_world_understanding.jsonl
---
| msamogh/indirect-requests | [
"task_categories:text-classification",
"task_categories:text-generation",
"task_categories:conversational",
"size_categories:n<1K",
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-01-13T23:06:21+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["n<1K"], "task_categories": ["text-classification", "text-generation", "conversational"], "pretty_name": "IIU-ToD"} | 2024-02-02T06:23:35+00:00 |
2c20cf64bb43e44752163d5bba61af55555daa1a | icaro23/icaroGC2 | [
"region:us"
] | 2024-01-13T23:11:51+00:00 | {} | 2024-01-13T23:11:51+00:00 |
|
7757e74a4a4669d0fba2c36e1687ece310b82b9f | icaro23/icaroGCO2 | [
"license:apache-2.0",
"region:us"
] | 2024-01-13T23:12:26+00:00 | {"license": "apache-2.0"} | 2024-01-13T23:16:22+00:00 |
|
b93c2d99508db7dd79aa911a780ecbbde8d508d7 | didius2006/coringagpu | [
"license:openrail",
"region:us"
] | 2024-01-13T23:16:28+00:00 | {"license": "openrail"} | 2024-01-13T23:19:10+00:00 |
|
f80ca395f174a2333ef5db46a217954492b010c7 |
# Dataset Card for Evaluation run of TeeZee/2xbagel-dpo-34b-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TeeZee/2xbagel-dpo-34b-v0.2](https://huggingface.co/TeeZee/2xbagel-dpo-34b-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TeeZee__2xbagel-dpo-34b-v0.2",
"harness_winogrande_5",
split="train")
```
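
If you only need the aggregated metrics rather than per-task details, the "results" configuration mentioned above can be loaded the same way; a minimal sketch (the column layout of the results parquet is not spelled out in this card, so inspect it before indexing):

```python
from datasets import load_dataset

# The "latest" split always points to the most recent evaluation run (see above).
results = load_dataset(
    "open-llm-leaderboard/details_TeeZee__2xbagel-dpo-34b-v0.2",
    "results",
    split="latest",
)

print(results.column_names)  # check the schema before relying on field names
print(results[0])
```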
## Latest results
These are the [latest results from run 2024-01-13T23:15:59.619735](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__2xbagel-dpo-34b-v0.2/blob/main/results_2024-01-13T23-15-59.619735.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.7214725397684685,
"acc_stderr": 0.029456464928054458,
"acc_norm": 0.7359963920471002,
"acc_norm_stderr": 0.030168902390549673,
"mc1": 0.5018359853121175,
"mc1_stderr": 0.017503383046877048,
"mc2": 0.6715187545754473,
"mc2_stderr": 0.015523811623029661
},
"harness|arc:challenge|25": {
"acc": 0.6356655290102389,
"acc_stderr": 0.014063260279882417,
"acc_norm": 0.6527303754266212,
"acc_norm_stderr": 0.013913034529620458
},
"harness|hellaswag|10": {
"acc": 0.6113324039036049,
"acc_stderr": 0.004864513262194309,
"acc_norm": 0.7934674367655845,
"acc_norm_stderr": 0.004039897423689437
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8289473684210527,
"acc_stderr": 0.030643607071677084,
"acc_norm": 0.8289473684210527,
"acc_norm_stderr": 0.030643607071677084
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036843,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036843
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7735849056603774,
"acc_stderr": 0.025757559893106737,
"acc_norm": 0.7735849056603774,
"acc_norm_stderr": 0.025757559893106737
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8819444444444444,
"acc_stderr": 0.02698334650330939,
"acc_norm": 0.8819444444444444,
"acc_norm_stderr": 0.02698334650330939
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0349610148119118,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0349610148119118
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7531914893617021,
"acc_stderr": 0.02818544130123409,
"acc_norm": 0.7531914893617021,
"acc_norm_stderr": 0.02818544130123409
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5350877192982456,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.5350877192982456,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6689655172413793,
"acc_stderr": 0.03921545312467122,
"acc_norm": 0.6689655172413793,
"acc_norm_stderr": 0.03921545312467122
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6507936507936508,
"acc_stderr": 0.02455229220934266,
"acc_norm": 0.6507936507936508,
"acc_norm_stderr": 0.02455229220934266
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5476190476190477,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.5476190476190477,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8935483870967742,
"acc_stderr": 0.017545102951656635,
"acc_norm": 0.8935483870967742,
"acc_norm_stderr": 0.017545102951656635
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5960591133004927,
"acc_stderr": 0.03452453903822032,
"acc_norm": 0.5960591133004927,
"acc_norm_stderr": 0.03452453903822032
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8303030303030303,
"acc_stderr": 0.02931118867498311,
"acc_norm": 0.8303030303030303,
"acc_norm_stderr": 0.02931118867498311
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.02048208677542421,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.02048208677542421
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9585492227979274,
"acc_stderr": 0.014385432857476453,
"acc_norm": 0.9585492227979274,
"acc_norm_stderr": 0.014385432857476453
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7769230769230769,
"acc_stderr": 0.02110773012724399,
"acc_norm": 0.7769230769230769,
"acc_norm_stderr": 0.02110773012724399
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.024762902678057943,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.024762902678057943
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.04042809961395634,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.04042809961395634
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9174311926605505,
"acc_stderr": 0.011800361363016567,
"acc_norm": 0.9174311926605505,
"acc_norm_stderr": 0.011800361363016567
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6620370370370371,
"acc_stderr": 0.03225941352631295,
"acc_norm": 0.6620370370370371,
"acc_norm_stderr": 0.03225941352631295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8970588235294118,
"acc_stderr": 0.021328337570804365,
"acc_norm": 0.8970588235294118,
"acc_norm_stderr": 0.021328337570804365
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.02799153425851952,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.02799153425851952
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8549618320610687,
"acc_stderr": 0.030884661089515375,
"acc_norm": 0.8549618320610687,
"acc_norm_stderr": 0.030884661089515375
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807193,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807193
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.03680918141673883,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.03680918141673883
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8404907975460123,
"acc_stderr": 0.02876748172598387,
"acc_norm": 0.8404907975460123,
"acc_norm_stderr": 0.02876748172598387
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04697113923010213,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04697113923010213
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.018315891685625845,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.018315891685625845
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.89272030651341,
"acc_stderr": 0.011066571449508435,
"acc_norm": 0.89272030651341,
"acc_norm_stderr": 0.011066571449508435
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7976878612716763,
"acc_stderr": 0.021628077380196124,
"acc_norm": 0.7976878612716763,
"acc_norm_stderr": 0.021628077380196124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.729608938547486,
"acc_stderr": 0.014854993938010081,
"acc_norm": 0.729608938547486,
"acc_norm_stderr": 0.014854993938010081
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8071895424836601,
"acc_stderr": 0.02258931888817668,
"acc_norm": 0.8071895424836601,
"acc_norm_stderr": 0.02258931888817668
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8070739549839229,
"acc_stderr": 0.022411516780911366,
"acc_norm": 0.8070739549839229,
"acc_norm_stderr": 0.022411516780911366
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8179012345679012,
"acc_stderr": 0.02147349183480833,
"acc_norm": 0.8179012345679012,
"acc_norm_stderr": 0.02147349183480833
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6382978723404256,
"acc_stderr": 0.02866382014719949,
"acc_norm": 0.6382978723404256,
"acc_norm_stderr": 0.02866382014719949
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5501955671447197,
"acc_stderr": 0.012705721498564972,
"acc_norm": 0.5501955671447197,
"acc_norm_stderr": 0.012705721498564972
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7867647058823529,
"acc_stderr": 0.024880971512294243,
"acc_norm": 0.7867647058823529,
"acc_norm_stderr": 0.024880971512294243
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7908496732026143,
"acc_stderr": 0.016453399332279326,
"acc_norm": 0.7908496732026143,
"acc_norm_stderr": 0.016453399332279326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7836734693877551,
"acc_stderr": 0.026358916334904045,
"acc_norm": 0.7836734693877551,
"acc_norm_stderr": 0.026358916334904045
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824664,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824664
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.88,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.026640582539133196,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.026640582539133196
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5018359853121175,
"mc1_stderr": 0.017503383046877048,
"mc2": 0.6715187545754473,
"mc2_stderr": 0.015523811623029661
},
"harness|winogrande|5": {
"acc": 0.7640094711917916,
"acc_stderr": 0.011933828850275626
},
"harness|gsm8k|5": {
"acc": 0.02122820318423048,
"acc_stderr": 0.003970449129848635
}
}
```
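
The raw JSON file linked above can also be fetched directly; a minimal sketch using `huggingface_hub`, with the filename taken from the link in this section (inspect the top-level keys, since the file may wrap the per-task scores in additional metadata):

```python
import json

from huggingface_hub import hf_hub_download

# Filename taken from the "Latest results" link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_TeeZee__2xbagel-dpo-34b-v0.2",
    repo_type="dataset",
    filename="results_2024-01-13T23-15-59.619735.json",
)

with open(path) as f:
    raw = json.load(f)

print(list(raw.keys()))  # inspect the structure before indexing into it
```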
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_TeeZee__2xbagel-dpo-34b-v0.2 | [
"region:us"
] | 2024-01-13T23:18:13+00:00 | {"pretty_name": "Evaluation run of TeeZee/2xbagel-dpo-34b-v0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [TeeZee/2xbagel-dpo-34b-v0.2](https://huggingface.co/TeeZee/2xbagel-dpo-34b-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TeeZee__2xbagel-dpo-34b-v0.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T23:15:59.619735](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__2xbagel-dpo-34b-v0.2/blob/main/results_2024-01-13T23-15-59.619735.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7214725397684685,\n \"acc_stderr\": 0.029456464928054458,\n \"acc_norm\": 0.7359963920471002,\n \"acc_norm_stderr\": 0.030168902390549673,\n \"mc1\": 0.5018359853121175,\n \"mc1_stderr\": 0.017503383046877048,\n \"mc2\": 0.6715187545754473,\n \"mc2_stderr\": 0.015523811623029661\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6356655290102389,\n \"acc_stderr\": 0.014063260279882417,\n \"acc_norm\": 0.6527303754266212,\n \"acc_norm_stderr\": 0.013913034529620458\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6113324039036049,\n \"acc_stderr\": 0.004864513262194309,\n \"acc_norm\": 0.7934674367655845,\n \"acc_norm_stderr\": 0.004039897423689437\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8289473684210527,\n \"acc_stderr\": 0.030643607071677084,\n \"acc_norm\": 0.8289473684210527,\n \"acc_norm_stderr\": 0.030643607071677084\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036843,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036843\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7735849056603774,\n \"acc_stderr\": 0.025757559893106737,\n \"acc_norm\": 0.7735849056603774,\n \"acc_norm_stderr\": 0.025757559893106737\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8819444444444444,\n \"acc_stderr\": 0.02698334650330939,\n \"acc_norm\": 0.8819444444444444,\n \"acc_norm_stderr\": 0.02698334650330939\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 
0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0349610148119118,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0349610148119118\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.04974229460422817,\n \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.04974229460422817\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7531914893617021,\n \"acc_stderr\": 0.02818544130123409,\n \"acc_norm\": 0.7531914893617021,\n \"acc_norm_stderr\": 0.02818544130123409\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5350877192982456,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.5350877192982456,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6689655172413793,\n \"acc_stderr\": 0.03921545312467122,\n \"acc_norm\": 0.6689655172413793,\n \"acc_norm_stderr\": 0.03921545312467122\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6507936507936508,\n \"acc_stderr\": 0.02455229220934266,\n \"acc_norm\": 0.6507936507936508,\n \"acc_norm_stderr\": 0.02455229220934266\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5476190476190477,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.5476190476190477,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8935483870967742,\n \"acc_stderr\": 0.017545102951656635,\n \"acc_norm\": 0.8935483870967742,\n \"acc_norm_stderr\": 0.017545102951656635\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5960591133004927,\n \"acc_stderr\": 0.03452453903822032,\n \"acc_norm\": 0.5960591133004927,\n \"acc_norm_stderr\": 0.03452453903822032\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8303030303030303,\n \"acc_stderr\": 0.02931118867498311,\n \"acc_norm\": 0.8303030303030303,\n \"acc_norm_stderr\": 0.02931118867498311\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.02048208677542421,\n \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.02048208677542421\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9585492227979274,\n \"acc_stderr\": 0.014385432857476453,\n \"acc_norm\": 0.9585492227979274,\n \"acc_norm_stderr\": 0.014385432857476453\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7769230769230769,\n \"acc_stderr\": 0.02110773012724399,\n \"acc_norm\": 0.7769230769230769,\n \"acc_norm_stderr\": 0.02110773012724399\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.024762902678057943,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.024762902678057943\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4304635761589404,\n \"acc_stderr\": 0.04042809961395634,\n \"acc_norm\": 0.4304635761589404,\n \"acc_norm_stderr\": 0.04042809961395634\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9174311926605505,\n \"acc_stderr\": 0.011800361363016567,\n \"acc_norm\": 0.9174311926605505,\n \"acc_norm_stderr\": 0.011800361363016567\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6620370370370371,\n \"acc_stderr\": 0.03225941352631295,\n \"acc_norm\": 0.6620370370370371,\n \"acc_norm_stderr\": 0.03225941352631295\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8970588235294118,\n \"acc_stderr\": 0.021328337570804365,\n \"acc_norm\": 0.8970588235294118,\n \"acc_norm_stderr\": 0.021328337570804365\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n \"acc_stderr\": 0.02799153425851952,\n \"acc_norm\": 0.7757847533632287,\n \"acc_norm_stderr\": 0.02799153425851952\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.030884661089515375,\n \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.030884661089515375\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807193,\n \"acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807193\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.03680918141673883,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.03680918141673883\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8404907975460123,\n \"acc_stderr\": 0.02876748172598387,\n \"acc_norm\": 0.8404907975460123,\n \"acc_norm_stderr\": 0.02876748172598387\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.04697113923010213,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.04697113923010213\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n \"acc_stderr\": 0.018315891685625845,\n \"acc_norm\": 0.9145299145299145,\n \"acc_norm_stderr\": 0.018315891685625845\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.89272030651341,\n \"acc_stderr\": 0.011066571449508435,\n \"acc_norm\": 0.89272030651341,\n \"acc_norm_stderr\": 0.011066571449508435\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7976878612716763,\n \"acc_stderr\": 0.021628077380196124,\n \"acc_norm\": 0.7976878612716763,\n \"acc_norm_stderr\": 0.021628077380196124\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.729608938547486,\n \"acc_stderr\": 0.014854993938010081,\n \"acc_norm\": 0.729608938547486,\n \"acc_norm_stderr\": 0.014854993938010081\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8071895424836601,\n \"acc_stderr\": 0.02258931888817668,\n \"acc_norm\": 0.8071895424836601,\n \"acc_norm_stderr\": 0.02258931888817668\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8070739549839229,\n \"acc_stderr\": 0.022411516780911366,\n \"acc_norm\": 0.8070739549839229,\n \"acc_norm_stderr\": 0.022411516780911366\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8179012345679012,\n \"acc_stderr\": 0.02147349183480833,\n \"acc_norm\": 0.8179012345679012,\n \"acc_norm_stderr\": 0.02147349183480833\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6382978723404256,\n \"acc_stderr\": 0.02866382014719949,\n \"acc_norm\": 0.6382978723404256,\n \"acc_norm_stderr\": 0.02866382014719949\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5501955671447197,\n \"acc_stderr\": 0.012705721498564972,\n \"acc_norm\": 0.5501955671447197,\n \"acc_norm_stderr\": 0.012705721498564972\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7867647058823529,\n \"acc_stderr\": 0.024880971512294243,\n \"acc_norm\": 0.7867647058823529,\n \"acc_norm_stderr\": 0.024880971512294243\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7908496732026143,\n \"acc_stderr\": 0.016453399332279326,\n \"acc_norm\": 0.7908496732026143,\n \"acc_norm_stderr\": 0.016453399332279326\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7836734693877551,\n \"acc_stderr\": 0.026358916334904045,\n \"acc_norm\": 0.7836734693877551,\n \"acc_norm_stderr\": 0.026358916334904045\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n \"acc_stderr\": 0.022076326101824664,\n \"acc_norm\": 0.8905472636815921,\n \"acc_norm_stderr\": 0.022076326101824664\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.026640582539133196,\n \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.026640582539133196\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5018359853121175,\n \"mc1_stderr\": 0.017503383046877048,\n \"mc2\": 0.6715187545754473,\n \"mc2_stderr\": 0.015523811623029661\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7640094711917916,\n \"acc_stderr\": 0.011933828850275626\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02122820318423048,\n \"acc_stderr\": 0.003970449129848635\n 
}\n}\n```", "repo_url": "https://huggingface.co/TeeZee/2xbagel-dpo-34b-v0.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|arc:challenge|25_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|gsm8k|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hellaswag|10_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-15-59.619735.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-15-59.619735.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-15-59.619735.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T23-15-59.619735.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-15-59.619735.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T23_15_59.619735", "path": ["**/details_harness|winogrande|5_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T23-15-59.619735.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T23_15_59.619735", "path": ["results_2024-01-13T23-15-59.619735.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T23-15-59.619735.parquet"]}]}]} | 2024-01-13T23:18:35+00:00 |
7d71fbfd61ce56f319afaa0efe74d243a21da081 |
# Dataset Card for Evaluation run of beowolx/MistralHermes-CodePro-7B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [beowolx/MistralHermes-CodePro-7B-v1](https://huggingface.co/beowolx/MistralHermes-CodePro-7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_beowolx__MistralHermes-CodePro-7B-v1",
"harness_winogrande_5",
split="train")
```
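
The aggregated metrics mentioned above can be loaded the same way, using the "results" configuration and its "latest" split. This is a minimal sketch; the exact column layout of the results parquet is not documented here, so it only inspects what is available:

```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split always
# points to the most recent evaluation run (here there is only one run).
results = load_dataset(
    "open-llm-leaderboard/details_beowolx__MistralHermes-CodePro-7B-v1",
    "results",
    split="latest",
)

# Inspect the available columns and the aggregated-results row(s).
print(results.column_names)
print(results[0])
```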
## Latest results
These are the [latest results from run 2024-01-13T23:16:31.615360](https://huggingface.co/datasets/open-llm-leaderboard/details_beowolx__MistralHermes-CodePro-7B-v1/blob/main/results_2024-01-13T23-16-31.615360.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6355378468432605,
"acc_stderr": 0.03226341558486178,
"acc_norm": 0.6374815210840533,
"acc_norm_stderr": 0.03291019935178123,
"mc1": 0.3488372093023256,
"mc1_stderr": 0.016684419859986893,
"mc2": 0.4966549787597113,
"mc2_stderr": 0.015039415129128687
},
"harness|arc:challenge|25": {
"acc": 0.590443686006826,
"acc_stderr": 0.014370358632472435,
"acc_norm": 0.6245733788395904,
"acc_norm_stderr": 0.01415063143511173
},
"harness|hellaswag|10": {
"acc": 0.629555865365465,
"acc_stderr": 0.004819367172685959,
"acc_norm": 0.8268273252340171,
"acc_norm_stderr": 0.0037762314890081123
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.0255428468174005,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.0255428468174005
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.02289168798455495,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.02289168798455495
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.02460362692409742,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.02460362692409742
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.01606005626853034,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.01606005626853034
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699796,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973136,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468348,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468348
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.014736926383761974,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.014736926383761974
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967294,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967294
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000318,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000318
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3488372093023256,
"mc1_stderr": 0.016684419859986893,
"mc2": 0.4966549787597113,
"mc2_stderr": 0.015039415129128687
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.011661223637643412
},
"harness|gsm8k|5": {
"acc": 0.6087945413191812,
"acc_stderr": 0.013442502402794302
}
}
```
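
The same numbers can also be read from the raw JSON file linked above. This is a sketch only: the filename is taken from the link, and the top-level layout of the raw file may differ from the excerpt shown, so the snippet just lists the keys rather than assuming a structure:

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON for this run from the dataset repository.
json_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_beowolx__MistralHermes-CodePro-7B-v1",
    repo_type="dataset",
    filename="results_2024-01-13T23-16-31.615360.json",
)

with open(json_path) as f:
    data = json.load(f)

# Show how the raw file is organised before drilling into specific metrics.
print(list(data.keys()))
```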
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_beowolx__MistralHermes-CodePro-7B-v1 | [
"region:us"
] | 2024-01-13T23:18:50+00:00 | {"pretty_name": "Evaluation run of beowolx/MistralHermes-CodePro-7B-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [beowolx/MistralHermes-CodePro-7B-v1](https://huggingface.co/beowolx/MistralHermes-CodePro-7B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beowolx__MistralHermes-CodePro-7B-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T23:16:31.615360](https://huggingface.co/datasets/open-llm-leaderboard/details_beowolx__MistralHermes-CodePro-7B-v1/blob/main/results_2024-01-13T23-16-31.615360.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6355378468432605,\n \"acc_stderr\": 0.03226341558486178,\n \"acc_norm\": 0.6374815210840533,\n \"acc_norm_stderr\": 0.03291019935178123,\n \"mc1\": 0.3488372093023256,\n \"mc1_stderr\": 0.016684419859986893,\n \"mc2\": 0.4966549787597113,\n \"mc2_stderr\": 0.015039415129128687\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.590443686006826,\n \"acc_stderr\": 0.014370358632472435,\n \"acc_norm\": 0.6245733788395904,\n \"acc_norm_stderr\": 0.01415063143511173\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.629555865365465,\n \"acc_stderr\": 0.004819367172685959,\n \"acc_norm\": 0.8268273252340171,\n \"acc_norm_stderr\": 0.0037762314890081123\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.0255428468174005,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.0255428468174005\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.02289168798455495,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.02289168798455495\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.02460362692409742,\n \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.02460362692409742\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8311926605504587,\n \"acc_stderr\": 0.01606005626853034,\n \"acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.01606005626853034\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699796,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8173690932311622,\n \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468348,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468348\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n \"acc_stderr\": 0.014736926383761974,\n \"acc_norm\": 0.2636871508379888,\n \"acc_norm_stderr\": 0.014736926383761974\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875195,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875195\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967294,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967294\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000318,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000318\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3488372093023256,\n \"mc1_stderr\": 0.016684419859986893,\n \"mc2\": 0.4966549787597113,\n \"mc2_stderr\": 0.015039415129128687\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.011661223637643412\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6087945413191812,\n \"acc_stderr\": 0.013442502402794302\n 
}\n}\n```", "repo_url": "https://huggingface.co/beowolx/MistralHermes-CodePro-7B-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|arc:challenge|25_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|gsm8k|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hellaswag|10_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-16-31.615360.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-16-31.615360.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-16-31.615360.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T23-16-31.615360.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-16-31.615360.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T23_16_31.615360", "path": ["**/details_harness|winogrande|5_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T23-16-31.615360.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T23_16_31.615360", "path": ["results_2024-01-13T23-16-31.615360.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T23-16-31.615360.parquet"]}]}]} | 2024-01-13T23:19:11+00:00 |
aa2fd412464674ac13464131425d5162ed102b5c | # FAVA Datasets
The FAVA datasets comprise two parts: annotation data and training data.
## Dataset Details
### Annotation Data
The annotation dataset contains 460 passages annotated with error identifications and edits according to our hallucination taxonomy. This dataset was used for the fine-grained error detection task, with the annotated passages serving as the gold passages.
### Training Data
The training data contains 35k training instances, each pairing an erroneous input with its corrected output, produced by our synthetic data generation pipeline.
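A minimal loading sketch is below; it assumes the repository's data files can be auto-detected by the `datasets` library, so the resulting split names are illustrative and may differ from the actual layout.
```python
from datasets import load_dataset

# Minimal sketch, assuming the data files in fava-uw/fava-data can be
# auto-detected by the `datasets` library; split names may differ.
fava = load_dataset("fava-uw/fava-data")

print(fava)  # show the available splits and their sizes
first_split = next(iter(fava))
print(fava[first_split][0])  # peek at one record (e.g. an input/output pair)
```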
| fava-uw/fava-data | [
"region:us"
] | 2024-01-13T23:19:52+00:00 | {} | 2024-01-15T04:55:30+00:00 |
6f793015a4dd51d931419336b7eee482d3e60e30 |
# Dataset of t91/T91/T91 (Girls' Frontline)
This is the dataset of t91/T91/T91 (Girls' Frontline), containing 12 images and their tags.
The core tags of this character are `blue_hair, hairband, ahoge, short_hair, breasts, bangs, orange_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 12.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t91_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 7.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t91_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 29 | 15.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t91_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 11.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t91_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 29 | 20.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t91_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
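If you only need one of the pre-processed IMG+TXT packages rather than the raw archive, the sketch below fetches and extracts the 800px package; it reuses the `hf_hub_download` pattern from the raw-data example in the next section, with the `dataset-800.zip` filename taken from the download link in the table above.
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Minimal sketch: download the 800px IMG+TXT package listed in the table above.
zip_file = hf_hub_download(
    repo_id='CyberHarem/t91_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# Extract the image / caption (.txt) pairs to a local directory.
extract_dir = 'dataset_800'
os.makedirs(extract_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(extract_dir)
```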
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/t91_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, solo, looking_at_viewer, white_background, simple_background, blush, cleavage, gloves, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | white_background | simple_background | blush | cleavage | gloves | smile |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-------------------|:--------------------|:--------|:-----------|:---------|:--------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X |
| CyberHarem/t91_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:21:12+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:23:44+00:00 |
be017e637bef0ad642af65b011ebfa977401d3b9 |
# Dataset of t_cms/T-CMS/T-CMS (Girls' Frontline)
This is the dataset of t_cms/T-CMS/T-CMS (Girls' Frontline), containing 15 images and their tags.
The core tags of this character are `grey_hair, long_hair, multicolored_hair, streaked_hair, bangs, hair_between_eyes, breasts, purple_eyes, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 15 | 36.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_cms_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 15 | 14.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_cms_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 39 | 32.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_cms_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 15 | 28.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_cms_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 39 | 56.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_cms_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/t_cms_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, solo, looking_at_viewer, blush, jacket, fur_trim, goggles_around_neck, coat, off_shoulder, bare_shoulders, black_gloves, black_shorts, open_clothes, holding, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | blush | jacket | fur_trim | goggles_around_neck | coat | off_shoulder | bare_shoulders | black_gloves | black_shorts | open_clothes | holding | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:---------|:-----------|:----------------------|:-------|:---------------|:-----------------|:---------------|:---------------|:---------------|:----------|:--------------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/t_cms_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:21:19+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:25:07+00:00 |
149cf722efdbcecb5f749ec889e42e47ec67a0f1 |
# Dataset of ks_23/KS-23/KS-23 (Girls' Frontline)
This is the dataset of ks_23/KS-23/KS-23 (Girls' Frontline), containing 17 images and their tags.
The core tags of this character are `breasts, orange_hair, large_breasts, yellow_eyes, ahoge, long_hair, red_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 17 | 16.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ks_23_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 17 | 10.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ks_23_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 36 | 21.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ks_23_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 17 | 14.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ks_23_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 36 | 28.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ks_23_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ks_23_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, looking_at_viewer, fingerless_gloves, sharp_teeth, solo, cleavage, navel, simple_background, blush, midriff, white_background, shorts, black_gloves, elbow_gloves, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | fingerless_gloves | sharp_teeth | solo | cleavage | navel | simple_background | blush | midriff | white_background | shorts | black_gloves | elbow_gloves | smile |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------------------|:--------------|:-------|:-----------|:--------|:--------------------|:--------|:----------|:-------------------|:---------|:---------------|:---------------|:--------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/ks_23_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:21:20+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:24:40+00:00 |
fc784922ff63c0df62657f38f3c11a33cda8a7e7 |
# Dataset of scar_l/SCAR-L (Girls' Frontline)
This is the dataset of scar_l/SCAR-L (Girls' Frontline), containing 19 images and their tags.
The core tags of this character are `bangs, blue_eyes, long_hair, blonde_hair, hat, hair_ornament, hairclip, black_headwear, brown_hair, purple_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 19 | 28.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scar_l_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 19 | 15.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scar_l_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 47 | 34.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scar_l_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 19 | 25.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scar_l_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 47 | 49.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scar_l_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
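If only one of the packaged archives is needed (for example the 800px IMG+TXT pack) instead of the raw data, it can be fetched the same way. The snippet below is a minimal sketch using the file name from the table above; the target directory name `scar_l_800` is just an illustrative choice.
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Download the 800px package listed in the table above.
zip_file = hf_hub_download(
    repo_id='CyberHarem/scar_l_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# Extract the images and their .txt tag files.
pack_dir = 'scar_l_800'
os.makedirs(pack_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(pack_dir)
```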
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/scar_l_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, looking_at_viewer, solo, closed_mouth, simple_background, jacket, white_background, white_shirt, blush, holding, scarf, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | closed_mouth | simple_background | jacket | white_background | white_shirt | blush | holding | scarf | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:---------------|:--------------------|:---------|:-------------------|:--------------|:--------|:----------|:--------|:-------------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/scar_l_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:21:24+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:25:08+00:00 |
403af1f194ca204e8224a463d7c3afad799dd343 |
# Dataset of scar_h/SCAR-H (Girls' Frontline)
This is the dataset of scar_h/SCAR-H (Girls' Frontline), containing 20 images and their tags.
The core tags of this character are `bangs, long_hair, blonde_hair, blue_eyes, hat, ponytail, white_headwear, baseball_cap, breasts, brown_hair, purple_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 20 | 25.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scar_h_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 20 | 13.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scar_h_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 49 | 30.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scar_h_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 20 | 22.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scar_h_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 49 | 43.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scar_h_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/scar_h_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | blue_gloves, 1girl, solo, assault_rifle, black_jacket, feet_out_of_frame, holding_gun, looking_at_viewer, white_background, long_sleeves, midriff, navel, pants, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blue_gloves | 1girl | solo | assault_rifle | black_jacket | feet_out_of_frame | holding_gun | looking_at_viewer | white_background | long_sleeves | midriff | navel | pants | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------|:--------|:-------|:----------------|:---------------|:--------------------|:--------------|:--------------------|:-------------------|:---------------|:----------|:--------|:--------|:--------------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/scar_h_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:21:37+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:27:54+00:00 |
fc7ed37478073bfd1f9844a12b6a83128430a672 |
# Dataset of dp28/DP28/DP28 (Girls' Frontline)
This is the dataset of dp28/DP28/DP28 (Girls' Frontline), containing 26 images and their tags.
The core tags of this character are `blonde_hair, long_hair, blue_eyes, breasts, large_breasts, hat, braid, fur_hat, white_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 26 | 41.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dp28_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 26 | 19.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dp28_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 66 | 45.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dp28_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 26 | 34.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dp28_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 66 | 72.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dp28_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/dp28_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, white_gloves, cleavage, solo, belt, blush, thighhighs, looking_at_viewer, black_panties, simple_background, white_background, side-tie_panties |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | white_gloves | cleavage | solo | belt | blush | thighhighs | looking_at_viewer | black_panties | simple_background | white_background | side-tie_panties |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-----------|:-------|:-------|:--------|:-------------|:--------------------|:----------------|:--------------------|:-------------------|:-------------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/dp28_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:21:54+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:29:06+00:00 |
0a2c8c7553a685a4e29616b9dd8a679deff28991 | fairnightzz/os-anki | [
"region:us"
] | 2024-01-13T23:22:13+00:00 | {} | 2024-01-13T23:32:03+00:00 |
|
dff5df95201ee76752dee4b05b1fb81642c8b38d | llm-aes/asappp-1-2-original | [
"region:us"
] | 2024-01-13T23:24:20+00:00 | {"dataset_info": {"features": [{"name": "essay_set", "dtype": "int64"}, {"name": "essay", "dtype": "string"}, {"name": "rater1_domain1", "dtype": "int64"}, {"name": "rater2_domain1", "dtype": "int64"}, {"name": "domain1_score", "dtype": "int64"}, {"name": "rubrics", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "content", "dtype": "int64"}, {"name": "organization", "dtype": "int64"}, {"name": "word_choice", "dtype": "int64"}, {"name": "sentence_fluency", "dtype": "int64"}, {"name": "conventions", "dtype": "int64"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 14489590, "num_examples": 3583}], "download_size": 4033411, "dataset_size": 14489590}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T02:57:26+00:00 |
|
e818ed5dfdde4c36986140d2b0e93e5a6c9d0464 | Budzisnki/voz_agro | [
"license:openrail",
"region:us"
] | 2024-01-13T23:24:22+00:00 | {"license": "openrail"} | 2024-01-13T23:56:04+00:00 |
|
6b4efa3089a1c5fa137a0fa4d6017a2c4bbda83c |
# Dataset of pennsylvania/ペンシルベニア/宾夕法尼亚 (Azur Lane)
This is the dataset of pennsylvania/ペンシルベニア/宾夕法尼亚 (Azur Lane), containing 10 images and their tags.
The core tags of this character are `long_hair, green_eyes, brown_hair, breasts, ponytail, large_breasts, hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 13.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pennsylvania_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 7.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pennsylvania_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 20 | 13.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pennsylvania_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 11.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pennsylvania_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 20 | 19.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pennsylvania_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/pennsylvania_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, solo, pantyhose, simple_background, white_background, black_gloves, cleavage, looking_at_viewer, blush, uniform |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | pantyhose | simple_background | white_background | black_gloves | cleavage | looking_at_viewer | blush | uniform |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:------------|:--------------------|:-------------------|:---------------|:-----------|:--------------------|:--------|:----------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/pennsylvania_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:24:58+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:29:04+00:00 |
3286e64ca04e49406361ceb2b25503cae956fdcd |
# Dataset of kuroshio/黒潮/黑潮 (Azur Lane)
This is the dataset of kuroshio/黒潮/黑潮 (Azur Lane), containing 10 images and their tags.
The core tags of this character are `braid, horns, red_eyes, hair_flower, hair_ornament, long_hair, twin_braids, bangs, pointy_ears, black_hair, bow, red_bow, hair_bow, sidelocks, red_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 10.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuroshio_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 6.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuroshio_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 21 | 11.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuroshio_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 9.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuroshio_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 21 | 16.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuroshio_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kuroshio_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, solo, looking_at_viewer, black_scarf, pleated_skirt, red_thighhighs, bare_shoulders, black_skirt, obi, white_background, bridal_gauntlets, elbow_gloves, panties, simple_background, garter_straps, weapon, blush, closed_mouth, floral_print, full_body, kimono, pink_flower, shoes, white_footwear |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | black_scarf | pleated_skirt | red_thighhighs | bare_shoulders | black_skirt | obi | white_background | bridal_gauntlets | elbow_gloves | panties | simple_background | garter_straps | weapon | blush | closed_mouth | floral_print | full_body | kimono | pink_flower | shoes | white_footwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------------|:----------------|:-----------------|:-----------------|:--------------|:------|:-------------------|:-------------------|:---------------|:----------|:--------------------|:----------------|:---------|:--------|:---------------|:---------------|:------------|:---------|:--------------|:--------|:-----------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/kuroshio_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:25:05+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:28:38+00:00 |
2764d196dd416beaa342814cbb7e3641e05f5b54 |
# Dataset of chicago/シカゴ/芝加哥 (Azur Lane)
This is the dataset of chicago/シカゴ/芝加哥 (Azur Lane), containing 21 images and their tags.
The core tags of this character are `breasts, drill_hair, blonde_hair, ahoge, blue_eyes, large_breasts, twin_drills, hair_between_eyes, long_hair, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 21 | 21.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chicago_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 21 | 14.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chicago_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 49 | 29.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chicago_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 21 | 19.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chicago_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 49 | 38.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chicago_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/chicago_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 21 |  |  |  |  |  | 1girl, smile, blush, cleavage, bare_shoulders, looking_at_viewer, navel, solo, black_choker, red_gloves, star_print, collarbone, midriff, elbow_gloves, criss-cross_halter, short_shorts, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | smile | blush | cleavage | bare_shoulders | looking_at_viewer | navel | solo | black_choker | red_gloves | star_print | collarbone | midriff | elbow_gloves | criss-cross_halter | short_shorts | sitting |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:-----------|:-----------------|:--------------------|:--------|:-------|:---------------|:-------------|:-------------|:-------------|:----------|:---------------|:---------------------|:---------------|:----------|
| 0 | 21 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/chicago_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:25:15+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:30:36+00:00 |
5f359fbe189c6347b2f761dfdbd32767b01b7b40 | llm-aes/doc-storygen-v2 | [
"region:us"
] | 2024-01-13T23:29:35+00:00 | {"dataset_info": {"features": [{"name": "worker_id", "dtype": "string"}, {"name": "task_id", "dtype": "string"}, {"name": "task_response_id", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "premise", "dtype": "string"}, {"name": "plan1", "dtype": "string"}, {"name": "plan2", "dtype": "string"}, {"name": "Q1", "dtype": "string"}, {"name": "Q2", "dtype": "string"}, {"name": "Q3", "dtype": "string"}, {"name": "Q4", "dtype": "string"}, {"name": "Q5", "dtype": "string"}, {"name": "Q6", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 60995214, "num_examples": 7000}], "download_size": 28333525, "dataset_size": 60995214}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-15T03:07:21+00:00 |
|
8c9352dc9a82af478027cd320d0eae0642bed2fd | llm-aes/hanna | [
"region:us"
] | 2024-01-13T23:31:49+00:00 | {"dataset_info": {"features": [{"name": "Story_ID", "dtype": "int64"}, {"name": "Prompt", "dtype": "string"}, {"name": "Human", "dtype": "string"}, {"name": "Story", "dtype": "string"}, {"name": "Model", "dtype": "string"}, {"name": "Relevance", "dtype": "int64"}, {"name": "Coherence", "dtype": "int64"}, {"name": "Empathy", "dtype": "int64"}, {"name": "Surprise", "dtype": "int64"}, {"name": "Engagement", "dtype": "int64"}, {"name": "Complexity", "dtype": "int64"}, {"name": "Worker_ID", "dtype": "string"}, {"name": "Assignment_ID", "dtype": "string"}, {"name": "Work_time_in_seconds", "dtype": "float64"}, {"name": "Name", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 13401106, "num_examples": 3168}], "download_size": 1721485, "dataset_size": 13401106}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T08:21:47+00:00 |
|
a2cc15f2341877215be896763625976350e32dff |
# Dataset Card for Evaluation run of Pierre-obi/Mistral_solar-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Pierre-obi/Mistral_solar-slerp](https://huggingface.co/Pierre-obi/Mistral_solar-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Pierre-obi__Mistral_solar-slerp",
"harness_winogrande_5",
split="train")
```
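The aggregated metrics mentioned above can be loaded in the same way. The snippet below is only a sketch: it assumes the "results" configuration exposes a "latest" split alias, as the per-task configurations do.
```python
from datasets import load_dataset

# Load the aggregated "results" configuration of this run.
# Assumes it exposes a "latest" split alias like the per-task configurations.
results = load_dataset(
    "open-llm-leaderboard/details_Pierre-obi__Mistral_solar-slerp",
    "results",
    split="latest",
)

# Inspect the first record of aggregated metrics.
print(results[0])
```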
## Latest results
These are the [latest results from run 2024-01-13T23:33:11.418111](https://huggingface.co/datasets/open-llm-leaderboard/details_Pierre-obi__Mistral_solar-slerp/blob/main/results_2024-01-13T23-33-11.418111.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.40347501414405273,
"acc_stderr": 0.03383375290012146,
"acc_norm": 0.40822900373379084,
"acc_norm_stderr": 0.03472416283155831,
"mc1": 0.2876376988984088,
"mc1_stderr": 0.015846315101394802,
"mc2": 0.46956525596934184,
"mc2_stderr": 0.015501210721813442
},
"harness|arc:challenge|25": {
"acc": 0.4044368600682594,
"acc_stderr": 0.014342036483436174,
"acc_norm": 0.4300341296928328,
"acc_norm_stderr": 0.014467631559137994
},
"harness|hellaswag|10": {
"acc": 0.4433379804819757,
"acc_stderr": 0.004957637648426472,
"acc_norm": 0.5792670782712607,
"acc_norm_stderr": 0.004926678108601339
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3881578947368421,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.3881578947368421,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4226415094339623,
"acc_stderr": 0.030402331445769537,
"acc_norm": 0.4226415094339623,
"acc_norm_stderr": 0.030402331445769537
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3583815028901734,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.3583815028901734,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278007,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278007
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.0240268463928735,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.0240268463928735
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.037649508797906066,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.037649508797906066
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2064516129032258,
"acc_stderr": 0.023025899617188726,
"acc_norm": 0.2064516129032258,
"acc_norm_stderr": 0.023025899617188726
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.296969696969697,
"acc_stderr": 0.0356796977226805,
"acc_norm": 0.296969696969697,
"acc_norm_stderr": 0.0356796977226805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.46464646464646464,
"acc_stderr": 0.03553436368828063,
"acc_norm": 0.46464646464646464,
"acc_norm_stderr": 0.03553436368828063
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6476683937823834,
"acc_stderr": 0.03447478286414357,
"acc_norm": 0.6476683937823834,
"acc_norm_stderr": 0.03447478286414357
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.441025641025641,
"acc_stderr": 0.025174048384000756,
"acc_norm": 0.441025641025641,
"acc_norm_stderr": 0.025174048384000756
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.026335739404055803,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.026335739404055803
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42016806722689076,
"acc_stderr": 0.03206183783236153,
"acc_norm": 0.42016806722689076,
"acc_norm_stderr": 0.03206183783236153
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.43119266055045874,
"acc_stderr": 0.021233365030319563,
"acc_norm": 0.43119266055045874,
"acc_norm_stderr": 0.021233365030319563
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.030058202704309846,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.030058202704309846
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.31862745098039214,
"acc_stderr": 0.0327028718148208,
"acc_norm": 0.31862745098039214,
"acc_norm_stderr": 0.0327028718148208
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.459915611814346,
"acc_stderr": 0.03244246810187914,
"acc_norm": 0.459915611814346,
"acc_norm_stderr": 0.03244246810187914
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5381165919282511,
"acc_stderr": 0.03346015011973228,
"acc_norm": 0.5381165919282511,
"acc_norm_stderr": 0.03346015011973228
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5114503816793893,
"acc_stderr": 0.04384140024078016,
"acc_norm": 0.5114503816793893,
"acc_norm_stderr": 0.04384140024078016
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.043207678075366705,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.043207678075366705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536823,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536823
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4233128834355828,
"acc_stderr": 0.038818912133343826,
"acc_norm": 0.4233128834355828,
"acc_norm_stderr": 0.038818912133343826
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.5631067961165048,
"acc_stderr": 0.049111471073657764,
"acc_norm": 0.5631067961165048,
"acc_norm_stderr": 0.049111471073657764
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7094017094017094,
"acc_stderr": 0.029745048572674064,
"acc_norm": 0.7094017094017094,
"acc_norm_stderr": 0.029745048572674064
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.51213282247765,
"acc_stderr": 0.017874698667491338,
"acc_norm": 0.51213282247765,
"acc_norm_stderr": 0.017874698667491338
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.026680134761679214,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.026680134761679214
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331146,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331146
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4150326797385621,
"acc_stderr": 0.028213504177824093,
"acc_norm": 0.4150326797385621,
"acc_norm_stderr": 0.028213504177824093
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4919614147909968,
"acc_stderr": 0.028394421370984545,
"acc_norm": 0.4919614147909968,
"acc_norm_stderr": 0.028394421370984545
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.39197530864197533,
"acc_stderr": 0.027163686038271233,
"acc_norm": 0.39197530864197533,
"acc_norm_stderr": 0.027163686038271233
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3120567375886525,
"acc_stderr": 0.02764012054516993,
"acc_norm": 0.3120567375886525,
"acc_norm_stderr": 0.02764012054516993
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2966101694915254,
"acc_stderr": 0.011665946586082854,
"acc_norm": 0.2966101694915254,
"acc_norm_stderr": 0.011665946586082854
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19852941176470587,
"acc_stderr": 0.024231013370541104,
"acc_norm": 0.19852941176470587,
"acc_norm_stderr": 0.024231013370541104
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3839869281045752,
"acc_stderr": 0.01967580813528152,
"acc_norm": 0.3839869281045752,
"acc_norm_stderr": 0.01967580813528152
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972745,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972745
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3795918367346939,
"acc_stderr": 0.031067211262872495,
"acc_norm": 0.3795918367346939,
"acc_norm_stderr": 0.031067211262872495
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.30845771144278605,
"acc_stderr": 0.03265819588512699,
"acc_norm": 0.30845771144278605,
"acc_norm_stderr": 0.03265819588512699
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.038284011150790206,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.038284011150790206
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.49707602339181284,
"acc_stderr": 0.03834759370936839,
"acc_norm": 0.49707602339181284,
"acc_norm_stderr": 0.03834759370936839
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2876376988984088,
"mc1_stderr": 0.015846315101394802,
"mc2": 0.46956525596934184,
"mc2_stderr": 0.015501210721813442
},
"harness|winogrande|5": {
"acc": 0.6819258089976322,
"acc_stderr": 0.013089285079884678
},
"harness|gsm8k|5": {
"acc": 0.006065200909780136,
"acc_stderr": 0.0021386703014604777
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Pierre-obi__Mistral_solar-slerp | [
"region:us"
] | 2024-01-13T23:35:29+00:00 | {"pretty_name": "Evaluation run of Pierre-obi/Mistral_solar-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Pierre-obi/Mistral_solar-slerp](https://huggingface.co/Pierre-obi/Mistral_solar-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Pierre-obi__Mistral_solar-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T23:33:11.418111](https://huggingface.co/datasets/open-llm-leaderboard/details_Pierre-obi__Mistral_solar-slerp/blob/main/results_2024-01-13T23-33-11.418111.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.40347501414405273,\n \"acc_stderr\": 0.03383375290012146,\n \"acc_norm\": 0.40822900373379084,\n \"acc_norm_stderr\": 0.03472416283155831,\n \"mc1\": 0.2876376988984088,\n \"mc1_stderr\": 0.015846315101394802,\n \"mc2\": 0.46956525596934184,\n \"mc2_stderr\": 0.015501210721813442\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4044368600682594,\n \"acc_stderr\": 0.014342036483436174,\n \"acc_norm\": 0.4300341296928328,\n \"acc_norm_stderr\": 0.014467631559137994\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4433379804819757,\n \"acc_stderr\": 0.004957637648426472,\n \"acc_norm\": 0.5792670782712607,\n \"acc_norm_stderr\": 0.004926678108601339\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3881578947368421,\n \"acc_stderr\": 0.03965842097512744,\n \"acc_norm\": 0.3881578947368421,\n \"acc_norm_stderr\": 0.03965842097512744\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4226415094339623,\n \"acc_stderr\": 0.030402331445769537,\n \"acc_norm\": 0.4226415094339623,\n \"acc_norm_stderr\": 0.030402331445769537\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3472222222222222,\n \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.25,\n 
\"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3583815028901734,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.3583815028901734,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.04372748290278007,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.04372748290278007\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192118,\n \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192118\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3201058201058201,\n \"acc_stderr\": 0.0240268463928735,\n \"acc_norm\": 0.3201058201058201,\n \"acc_norm_stderr\": 0.0240268463928735\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n \"acc_stderr\": 0.037649508797906066,\n \"acc_norm\": 0.23015873015873015,\n \"acc_norm_stderr\": 0.037649508797906066\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2064516129032258,\n \"acc_stderr\": 0.023025899617188726,\n \"acc_norm\": 0.2064516129032258,\n \"acc_norm_stderr\": 0.023025899617188726\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03344283744280458,\n \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03344283744280458\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.296969696969697,\n \"acc_stderr\": 0.0356796977226805,\n \"acc_norm\": 0.296969696969697,\n \"acc_norm_stderr\": 0.0356796977226805\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.46464646464646464,\n \"acc_stderr\": 0.03553436368828063,\n \"acc_norm\": 0.46464646464646464,\n \"acc_norm_stderr\": 0.03553436368828063\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6476683937823834,\n \"acc_stderr\": 0.03447478286414357,\n \"acc_norm\": 0.6476683937823834,\n \"acc_norm_stderr\": 0.03447478286414357\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.441025641025641,\n \"acc_stderr\": 0.025174048384000756,\n \"acc_norm\": 0.441025641025641,\n \"acc_norm_stderr\": 0.025174048384000756\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.026335739404055803,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.026335739404055803\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.42016806722689076,\n \"acc_stderr\": 0.03206183783236153,\n \"acc_norm\": 0.42016806722689076,\n \"acc_norm_stderr\": 0.03206183783236153\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.43119266055045874,\n \"acc_stderr\": 0.021233365030319563,\n \"acc_norm\": 0.43119266055045874,\n \"acc_norm_stderr\": 0.021233365030319563\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.030058202704309846,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.030058202704309846\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.31862745098039214,\n \"acc_stderr\": 0.0327028718148208,\n \"acc_norm\": 0.31862745098039214,\n \"acc_norm_stderr\": 0.0327028718148208\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.459915611814346,\n \"acc_stderr\": 0.03244246810187914,\n \"acc_norm\": 0.459915611814346,\n \"acc_norm_stderr\": 0.03244246810187914\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5381165919282511,\n \"acc_stderr\": 0.03346015011973228,\n \"acc_norm\": 0.5381165919282511,\n \"acc_norm_stderr\": 0.03346015011973228\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5114503816793893,\n \"acc_stderr\": 0.04384140024078016,\n \"acc_norm\": 0.5114503816793893,\n \"acc_norm_stderr\": 0.04384140024078016\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6611570247933884,\n \"acc_stderr\": 0.043207678075366705,\n \"acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.043207678075366705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.04812917324536823,\n \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.04812917324536823\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4233128834355828,\n \"acc_stderr\": 0.038818912133343826,\n \"acc_norm\": 0.4233128834355828,\n \"acc_norm_stderr\": 0.038818912133343826\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.049111471073657764,\n \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.049111471073657764\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7094017094017094,\n \"acc_stderr\": 0.029745048572674064,\n \"acc_norm\": 0.7094017094017094,\n \"acc_norm_stderr\": 0.029745048572674064\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.51213282247765,\n 
\"acc_stderr\": 0.017874698667491338,\n \"acc_norm\": 0.51213282247765,\n \"acc_norm_stderr\": 0.017874698667491338\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.026680134761679214,\n \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.026680134761679214\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n \"acc_stderr\": 0.014265554192331146,\n \"acc_norm\": 0.23910614525139665,\n \"acc_norm_stderr\": 0.014265554192331146\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4150326797385621,\n \"acc_stderr\": 0.028213504177824093,\n \"acc_norm\": 0.4150326797385621,\n \"acc_norm_stderr\": 0.028213504177824093\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4919614147909968,\n \"acc_stderr\": 0.028394421370984545,\n \"acc_norm\": 0.4919614147909968,\n \"acc_norm_stderr\": 0.028394421370984545\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.39197530864197533,\n \"acc_stderr\": 0.027163686038271233,\n \"acc_norm\": 0.39197530864197533,\n \"acc_norm_stderr\": 0.027163686038271233\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3120567375886525,\n \"acc_stderr\": 0.02764012054516993,\n \"acc_norm\": 0.3120567375886525,\n \"acc_norm_stderr\": 0.02764012054516993\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2966101694915254,\n \"acc_stderr\": 0.011665946586082854,\n \"acc_norm\": 0.2966101694915254,\n \"acc_norm_stderr\": 0.011665946586082854\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.19852941176470587,\n \"acc_stderr\": 0.024231013370541104,\n \"acc_norm\": 0.19852941176470587,\n \"acc_norm_stderr\": 0.024231013370541104\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.3839869281045752,\n \"acc_stderr\": 0.01967580813528152,\n \"acc_norm\": 0.3839869281045752,\n \"acc_norm_stderr\": 0.01967580813528152\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.04769300568972745,\n \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.04769300568972745\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3795918367346939,\n \"acc_stderr\": 0.031067211262872495,\n \"acc_norm\": 0.3795918367346939,\n \"acc_norm_stderr\": 0.031067211262872495\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.30845771144278605,\n \"acc_stderr\": 0.03265819588512699,\n \"acc_norm\": 0.30845771144278605,\n \"acc_norm_stderr\": 0.03265819588512699\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.40963855421686746,\n \"acc_norm_stderr\": 0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.49707602339181284,\n \"acc_stderr\": 0.03834759370936839,\n \"acc_norm\": 0.49707602339181284,\n \"acc_norm_stderr\": 0.03834759370936839\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2876376988984088,\n \"mc1_stderr\": 0.015846315101394802,\n \"mc2\": 0.46956525596934184,\n \"mc2_stderr\": 0.015501210721813442\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6819258089976322,\n \"acc_stderr\": 0.013089285079884678\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006065200909780136,\n \"acc_stderr\": 0.0021386703014604777\n 
}\n}\n```", "repo_url": "https://huggingface.co/Pierre-obi/Mistral_solar-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|arc:challenge|25_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|gsm8k|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hellaswag|10_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-33-11.418111.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-33-11.418111.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-33-11.418111.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-33-11.418111.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|winogrande|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T23_33_11.418111", "path": ["results_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T23-33-11.418111.parquet"]}]}]} | 2024-01-13T23:35:50+00:00 |
756a8e32ec189e4df67038c92aef537b00d428f3 |
A further augmented and modified version of [Augmental-Dataset](https://huggingface.co/datasets/Heralax/Augmental-Dataset) for Steins;Gate-themed RP in Fastchat format. The original dataset was changed in the following ways:
- The first prompt is modified to add context, simple references to aspects of the conversation (OOC, use of emojis, content), the scenario setup, and character introductions.
- All split conversations were joined.
- The assistant always plays only a single character, chosen as the character with the most lines who is not the first speaker. All other characters are assigned to the user. This is described precisely in the first prompt.
- Conversations alternate between user and assistant, with the first prompt always from the user and the last always from the assistant. A minimal loading sketch follows below.
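As a quick way to inspect the result, the sketch below loads the dataset with the `datasets` library and prints the roles in the first conversation. It assumes the repository's data files are in a format `load_dataset` can auto-detect and that each record carries a Fastchat-style `conversations` list of `{"from", "value"}` turns; both are assumptions to verify against the actual files.
```python
from datasets import load_dataset

# Load the dataset (assumes the repo's files are auto-detectable by `datasets`).
ds = load_dataset("grimulkan/Augmental-Stenisgate-Augmented", split="train")

# Inspect the first conversation (assumes a Fastchat-style "conversations" column
# of {"from": ..., "value": ...} turns -- an assumption, not a documented schema).
for turn in ds[0]["conversations"]:
    print(turn["from"], "->", turn["value"][:80])
```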
| grimulkan/Augmental-Stenisgate-Augmented | [
"license:unknown",
"region:us"
] | 2024-01-13T23:37:29+00:00 | {"license": "unknown"} | 2024-01-13T23:45:27+00:00 |
643aefcdb871e216eea3bc827ac7b6e00e4d2f79 |
# Dataset of scw/SCW/SCW (Girls' Frontline)
This is the dataset of scw/SCW/SCW (Girls' Frontline), containing 14 images and their tags.
The core tags of this character are `blonde_hair, short_hair, red_eyes, headphones, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 14 | 16.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scw_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 14 | 11.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scw_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 30 | 20.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scw_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 14 | 16.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scw_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 30 | 25.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scw_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/scw_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
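If you only need one of the pre-processed packages listed in the table above rather than the raw archive, it can be fetched the same way. Below is a minimal sketch for the 800px IMG+TXT package; it assumes each image in the archive is accompanied by a same-named `.txt` tag file, which should be verified after extraction.
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package instead of the raw archive
zip_file = hf_hub_download(
    repo_id='CyberHarem/scw_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)

dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its tag sidecar (assumes one same-named .txt per image)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
        tag_path = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(tag_path):
            with open(tag_path, 'r', encoding='utf-8') as f:
                print(name, '->', f.read().strip())
```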
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, solo, jacket, gloves, looking_at_viewer, smile, assault_rifle, armband, boots, holding_gun, single_thighhigh, socks, uneven_legwear, bag, eagle, full_body, headset, red_scarf |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | jacket | gloves | looking_at_viewer | smile | assault_rifle | armband | boots | holding_gun | single_thighhigh | socks | uneven_legwear | bag | eagle | full_body | headset | red_scarf |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------|:---------|:--------------------|:--------|:----------------|:----------|:--------|:--------------|:-------------------|:--------|:-----------------|:------|:--------|:------------|:----------|:------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/scw_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:43:27+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:46:19+00:00 |
c70a746a039c4cdf15a9f7b8bdc1ce3ce295d7e8 |
# Dataset of m9/M9/M9 (Girls' Frontline)
This is the dataset of m9/M9/M9 (Girls' Frontline), containing 12 images and their tags.
The core tags of this character are `blonde_hair, long_hair, red_eyes, hairband, fang, very_long_hair, breasts, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 10.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m9_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 7.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m9_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 26 | 15.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m9_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 10.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m9_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 26 | 19.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m9_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/m9_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, solo, smile, looking_at_viewer, open_mouth, detached_sleeves, handgun, bare_shoulders, blush, red_dress, black_pantyhose |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | looking_at_viewer | open_mouth | detached_sleeves | handgun | bare_shoulders | blush | red_dress | black_pantyhose |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:-------------|:-------------------|:----------|:-----------------|:--------|:------------|:------------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
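To work with a single cluster, the loaded items can be filtered by the tags listed above. A minimal sketch, assuming `item.meta['tags']` is a mapping keyed by tag name (that layout is an assumption; check one item's metadata first):
```python
from waifuc.source import LocalSource

# keep only images whose tags include 'smile' (one of the cluster tags above)
source = LocalSource('dataset_dir')
for item in source:
    tags = item.meta.get('tags', {})
    if 'smile' in tags:
        print(item.meta['filename'], sorted(tags)[:10])
```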
| CyberHarem/m9_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:43:28+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:46:08+00:00 |
7326c3933956173c6c0c150d218665a77b33d9a4 |
# Dataset of m500/M500/M500 (Girls' Frontline)
This is the dataset of m500/M500/M500 (Girls' Frontline), containing 30 images and their tags.
The core tags of this character are `animal_ears, blonde_hair, long_hair, blue_eyes, breasts, goggles_on_head, large_breasts, tail, bangs, fang`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 30 | 29.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m500_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 30 | 19.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m500_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 65 | 36.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m500_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 30 | 26.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m500_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 65 | 48.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m500_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/m500_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
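To turn the loaded items into a caption-style training layout (one image plus one comma-separated tag file per sample), something like the sketch below can be used. It assumes `item.image` is a PIL image and `item.meta['tags']` is a tag-keyed mapping; both are assumptions about the waifuc item layout rather than documented guarantees.
```python
import os

from waifuc.source import LocalSource

export_dir = 'export_dir'
os.makedirs(export_dir, exist_ok=True)

# export every item as NNNN.png plus an NNNN.txt tag sidecar
source = LocalSource('dataset_dir')
for i, item in enumerate(source):
    item.image.save(os.path.join(export_dir, f'{i:04d}.png'))
    with open(os.path.join(export_dir, f'{i:04d}.txt'), 'w', encoding='utf-8') as f:
        f.write(', '.join(item.meta.get('tags', {})))
```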
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------|
| 0 | 30 |  |  |  |  |  | 1girl, solo, smile, open_mouth, shirt, cleavage, holding, goggles, shorts, blush, gloves, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | open_mouth | shirt | cleavage | holding | goggles | shorts | blush | gloves | looking_at_viewer |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-------------|:--------|:-----------|:----------|:----------|:---------|:--------|:---------|:--------------------|
| 0 | 30 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/m500_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:43:30+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:49:08+00:00 |
95707712060be389f0decc608b14b900517cc8a5 |
# Dataset of mg3/MG3/MG3 (Girls' Frontline)
This is the dataset of mg3/MG3/MG3 (Girls' Frontline), containing 13 images and their tags.
The core tags of this character are `blonde_hair, blue_eyes, breasts, large_breasts, long_hair, braid, single_braid, bangs, hair_between_eyes, hair_ornament, hairclip, hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 19.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mg3_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 11.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mg3_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 31 | 20.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mg3_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 17.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mg3_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 31 | 28.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mg3_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mg3_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, solo, looking_at_viewer, blush, black_pantyhose, sweater, boots, cleavage, full_body, gun, necklace, off_shoulder |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | blush | black_pantyhose | sweater | boots | cleavage | full_body | gun | necklace | off_shoulder |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:------------------|:----------|:--------|:-----------|:------------|:------|:-----------|:---------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/mg3_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:43:35+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:47:50+00:00 |
a48227d0965e7b85af25500f23122959c70ed0bc |
# Dataset of galil/ガリル/加利尔 (Girls' Frontline)
This is the dataset of galil/ガリル/加利尔 (Girls' Frontline), containing 10 images and their tags.
The core tags of this character are `long_hair, ahoge, brown_hair, brown_eyes, blonde_hair, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 9.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galil_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 6.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galil_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 23 | 12.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galil_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 8.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galil_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 23 | 15.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galil_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/galil_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from the clusters here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, solo, looking_at_viewer, simple_background, skirt, white_background, assault_rifle, holding_weapon, jacket, military_uniform, necklace, pantyhose, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | simple_background | skirt | white_background | assault_rifle | holding_weapon | jacket | military_uniform | necklace | pantyhose | smile |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------------------|:--------|:-------------------|:----------------|:-----------------|:---------|:-------------------|:-----------|:------------|:--------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/galil_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:43:45+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:47:04+00:00 |
5e4e072dcd9d13940d5316c690ed133ea00fa3f6 | MatsuoDochiai/Took1 | [
"license:openrail",
"region:us"
] | 2024-01-13T23:45:36+00:00 | {"license": "openrail"} | 2024-01-13T23:47:50+00:00 |
|
9496bbcd0832d60c1882ba7d03ab8772f15e85dd |
# Dataset of leonardo_da_vinci/レオナルド・ダ・ヴィンチ/莱昂纳多·达·芬奇 (Azur Lane)
This is the dataset of leonardo_da_vinci/レオナルド・ダ・ヴィンチ/莱昂纳多·达·芬奇 (Azur Lane), containing 56 images and their tags.
The core tags of this character are `blonde_hair, long_hair, twintails, breasts, bangs, goggles_on_head, small_breasts, orange_eyes, red_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 56 | 77.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leonardo_da_vinci_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 56 | 43.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leonardo_da_vinci_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 139 | 90.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leonardo_da_vinci_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 56 | 67.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leonardo_da_vinci_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 139 | 127.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leonardo_da_vinci_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/leonardo_da_vinci_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from the clusters here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 32 |  |  |  |  |  | 1girl, solo, goggles, looking_at_viewer, smile, bare_shoulders, navel, blush, thighhighs, simple_background, off_shoulder, zipper_pull_tab, white_background, white_coat, open_coat, open_mouth, thigh_strap, thighs, highleg_swimsuit, long_sleeves |
| 1 | 10 |  |  |  |  |  | 1girl, solo, earrings, hair_flower, looking_at_viewer, navel, tiara, wings, bare_shoulders, medium_breasts, smile, ballerina, collarbone, red_dress, tutu, ballet_slippers, white_pantyhose, closed_mouth, full_body, red_footwear, red_rose, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | goggles | looking_at_viewer | smile | bare_shoulders | navel | blush | thighhighs | simple_background | off_shoulder | zipper_pull_tab | white_background | white_coat | open_coat | open_mouth | thigh_strap | thighs | highleg_swimsuit | long_sleeves | earrings | hair_flower | tiara | wings | medium_breasts | ballerina | collarbone | red_dress | tutu | ballet_slippers | white_pantyhose | closed_mouth | full_body | red_footwear | red_rose | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------|:--------------------|:--------|:-----------------|:--------|:--------|:-------------|:--------------------|:---------------|:------------------|:-------------------|:-------------|:------------|:-------------|:--------------|:---------|:-------------------|:---------------|:-----------|:--------------|:--------|:--------|:-----------------|:------------|:-------------|:------------|:-------|:------------------|:------------------|:---------------|:------------|:---------------|:-----------|:-----------|
| 0 | 32 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | | X | X | X | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/leonardo_da_vinci_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:47:36+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:59:57+00:00 |
a172360f32cb20c1086a1fe4e46681c70e128f40 |
# Dataset of attilio_regolo/アッティリオ・レゴロ/阿蒂利奥·雷戈洛 (Azur Lane)
This is the dataset of attilio_regolo/アッティリオ・レゴロ/阿蒂利奥·雷戈洛 (Azur Lane), containing 29 images and their tags.
The core tags of this character are `long_hair, purple_eyes, bangs, ahoge, twintails, blonde_hair, very_long_hair, hair_between_eyes, bow, ribbon, hair_ornament, breasts, fang, symbol-shaped_pupils`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 29 | 46.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/attilio_regolo_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 29 | 22.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/attilio_regolo_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 65 | 47.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/attilio_regolo_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 29 | 38.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/attilio_regolo_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 65 | 75.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/attilio_regolo_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/attilio_regolo_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from the clusters here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 29 |  |  |  |  |  | 1girl, solo, bare_shoulders, blush, looking_at_viewer, open_mouth, long_sleeves, dress, heart, collarbone, underwear, detached_sleeves, :d, sitting, halterneck |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | bare_shoulders | blush | looking_at_viewer | open_mouth | long_sleeves | dress | heart | collarbone | underwear | detached_sleeves | :d | sitting | halterneck |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------------|:--------|:--------------------|:-------------|:---------------|:--------|:--------|:-------------|:------------|:-------------------|:-----|:----------|:-------------|
| 0 | 29 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/attilio_regolo_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:47:37+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:54:06+00:00 |
7f42c08dba1c7d814efc36356d14e41d500e3c9f |
# Dataset of kagero/陽炎/阳炎 (Azur Lane)
This is the dataset of kagero/陽炎/阳炎 (Azur Lane), containing 13 images and their tags.
The core tags of this character are `animal_ears, brown_hair, purple_eyes, twintails, bangs, fang, fox_ears, rabbit_ears, short_hair, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 9.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagero_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 7.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagero_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 21 | 12.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagero_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 9.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagero_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 21 | 14.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagero_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kagero_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from the clusters here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | looking_at_viewer, 1girl, solo, bare_shoulders, blush, detached_sleeves, open_mouth, wide_sleeves, collarbone, simple_background, :d, full_body, long_sleeves, machinery, turret, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | 1girl | solo | bare_shoulders | blush | detached_sleeves | open_mouth | wide_sleeves | collarbone | simple_background | :d | full_body | long_sleeves | machinery | turret | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:--------|:-------|:-----------------|:--------|:-------------------|:-------------|:---------------|:-------------|:--------------------|:-----|:------------|:---------------|:------------|:---------|:-------------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/kagero_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:47:38+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:51:08+00:00 |
aed85d45e538761d404495924484b01f22373b70 |
# Dataset of flandre/フランドル/弗兰德尔 (Azur Lane)
This is the dataset of flandre/フランドル/弗兰德尔 (Azur Lane), containing 42 images and their tags.
The core tags of this character are `long_hair, bangs, white_hair, twintails, purple_eyes, breasts, hat, small_breasts, bow, ribbon, grey_eyes, low_twintails`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 42 | 83.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/flandre_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 42 | 37.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/flandre_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 110 | 87.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/flandre_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 42 | 68.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/flandre_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 110 | 139.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/flandre_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/flandre_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from the clusters here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, black_thighhighs, garter_straps, long_sleeves, looking_at_viewer, solo, white_leotard, blush, grey_hair, thighs, closed_mouth, hair_ornament, smile, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_thighhighs | garter_straps | long_sleeves | looking_at_viewer | solo | white_leotard | blush | grey_hair | thighs | closed_mouth | hair_ornament | smile | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:----------------|:---------------|:--------------------|:-------|:----------------|:--------|:------------|:---------|:---------------|:----------------|:--------|:-------------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/flandre_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:47:40+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T00:00:57+00:00 |
2e1ac4c00e6eec9e66f727725de5c35b8c057cec |
# Dataset of michishio/満潮/满潮 (Azur Lane)
This is the dataset of michishio/満潮/满潮 (Azur Lane), containing 23 images and their tags.
The core tags of this character are `animal_ears, cat_ears, bangs, animal_ear_fluff, breasts, brown_hair, long_hair, ahoge, brown_eyes, braid, cat_girl, large_breasts, hair_between_eyes, cat_tail, medium_breasts, ribbon, tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 23 | 24.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/michishio_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 23 | 17.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/michishio_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 51 | 34.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/michishio_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 23 | 23.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/michishio_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 51 | 44.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/michishio_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/michishio_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from the clusters here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, balloon, detached_sleeves, looking_at_viewer, open_mouth, solo, :d, pink_dress, puffy_short_sleeves, frills, full_body, high_heels, pink_footwear, white_background, white_thighhighs, bare_shoulders, blush, bow, cleavage_cutout, hair_rings, jingle_bell, petals, simple_background, standing_on_one_leg, tiara, very_long_hair, virtual_youtuber |
| 1 | 15 |  |  |  |  |  | blush, jingle_bell, :d, neck_bell, open_mouth, kimono, long_sleeves, looking_at_viewer, red_skirt, 2girls, bare_shoulders, wide_sleeves, off_shoulder, pleated_skirt, white_shirt, sailor_collar, simple_background, white_background, holding, red_ribbon |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | balloon | detached_sleeves | looking_at_viewer | open_mouth | solo | :d | pink_dress | puffy_short_sleeves | frills | full_body | high_heels | pink_footwear | white_background | white_thighhighs | bare_shoulders | blush | bow | cleavage_cutout | hair_rings | jingle_bell | petals | simple_background | standing_on_one_leg | tiara | very_long_hair | virtual_youtuber | neck_bell | kimono | long_sleeves | red_skirt | 2girls | wide_sleeves | off_shoulder | pleated_skirt | white_shirt | sailor_collar | holding | red_ribbon |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:-------------------|:--------------------|:-------------|:-------|:-----|:-------------|:----------------------|:---------|:------------|:-------------|:----------------|:-------------------|:-------------------|:-----------------|:--------|:------|:------------------|:-------------|:--------------|:---------|:--------------------|:----------------------|:--------|:-----------------|:-------------------|:------------|:---------|:---------------|:------------|:---------|:---------------|:---------------|:----------------|:--------------|:----------------|:----------|:-------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 15 |  |  |  |  |  | | | | X | X | | X | | | | | | | X | | X | X | | | | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/michishio_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:48:02+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:54:25+00:00 |
8bf59d39e039531315f674d8fd5d245a13014d59 |
# Dataset Card for Evaluation run of Kquant03/Ryu-4x7B-MoE-bf16
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kquant03/Ryu-4x7B-MoE-bf16](https://huggingface.co/Kquant03/Ryu-4x7B-MoE-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kquant03__Ryu-4x7B-MoE-bf16",
"harness_winogrande_5",
split="train")
```
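
The aggregated metrics mentioned above can be loaded the same way by pointing `load_dataset` at the "results" configuration; a minimal sketch, assuming that configuration exposes a "train" split pointing at the latest run as described:

```python
from datasets import load_dataset

# load the aggregated results; per the description above, "train" always
# points to the latest run
results = load_dataset(
    "open-llm-leaderboard/details_Kquant03__Ryu-4x7B-MoE-bf16",
    "results",
    split="train",
)
print(results[0])
```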
## Latest results
These are the [latest results from run 2024-01-13T23:51:35.789085](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Ryu-4x7B-MoE-bf16/blob/main/results_2024-01-13T23-51-35.789085.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6396004158808674,
"acc_stderr": 0.032332778374865194,
"acc_norm": 0.6426234407115231,
"acc_norm_stderr": 0.03298221583193354,
"mc1": 0.49938800489596086,
"mc1_stderr": 0.01750348793889251,
"mc2": 0.649568492897451,
"mc2_stderr": 0.015609242157624164
},
"harness|arc:challenge|25": {
"acc": 0.643344709897611,
"acc_stderr": 0.013998056902620196,
"acc_norm": 0.6646757679180887,
"acc_norm_stderr": 0.01379618294778556
},
"harness|hellaswag|10": {
"acc": 0.6634136626170085,
"acc_stderr": 0.004715762925037027,
"acc_norm": 0.831009759012149,
"acc_norm_stderr": 0.0037397742854185247
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124484,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124484
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.02412112541694119,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.02412112541694119
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948482,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931796,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931796
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.030500283176545843,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.030500283176545843
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281365,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368985,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368985
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4547486033519553,
"acc_stderr": 0.016653875777524006,
"acc_norm": 0.4547486033519553,
"acc_norm_stderr": 0.016653875777524006
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.02500646975579921,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.02500646975579921
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4641460234680574,
"acc_stderr": 0.012737361318730581,
"acc_norm": 0.4641460234680574,
"acc_norm_stderr": 0.012737361318730581
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.0290294228156814,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.0290294228156814
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.019162418588623557,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.019162418588623557
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.49938800489596086,
"mc1_stderr": 0.01750348793889251,
"mc2": 0.649568492897451,
"mc2_stderr": 0.015609242157624164
},
"harness|winogrande|5": {
"acc": 0.7924230465666929,
"acc_stderr": 0.011398593419386798
},
"harness|gsm8k|5": {
"acc": 0.4973464746019712,
"acc_stderr": 0.01377229076885817
}
}
```
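
The same aggregate numbers can also be fetched programmatically from the results file linked above; a minimal sketch with `hf_hub_download`, assuming the file lives at the repository root under the name shown in the link (whether the per-task metrics sit under a top-level "results" key is an assumption, handled with a fallback):

```python
import json

from huggingface_hub import hf_hub_download

# fetch the results JSON file linked in the "Latest results" section above
results_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Kquant03__Ryu-4x7B-MoE-bf16",
    repo_type="dataset",
    filename="results_2024-01-13T23-51-35.789085.json",
)

with open(results_path, "r", encoding="utf-8") as f:
    payload = json.load(f)

# the per-task metrics may sit under a top-level "results" key; fall back to
# the raw payload if the file already has the layout shown above
metrics = payload.get("results", payload)
print(json.dumps(metrics["all"], indent=2))
```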
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Kquant03__Ryu-4x7B-MoE-bf16 | [
"region:us"
] | 2024-01-13T23:53:53+00:00 | {"pretty_name": "Evaluation run of Kquant03/Ryu-4x7B-MoE-bf16", "dataset_summary": "Dataset automatically created during the evaluation run of model [Kquant03/Ryu-4x7B-MoE-bf16](https://huggingface.co/Kquant03/Ryu-4x7B-MoE-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__Ryu-4x7B-MoE-bf16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T23:51:35.789085](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Ryu-4x7B-MoE-bf16/blob/main/results_2024-01-13T23-51-35.789085.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6396004158808674,\n \"acc_stderr\": 0.032332778374865194,\n \"acc_norm\": 0.6426234407115231,\n \"acc_norm_stderr\": 0.03298221583193354,\n \"mc1\": 0.49938800489596086,\n \"mc1_stderr\": 0.01750348793889251,\n \"mc2\": 0.649568492897451,\n \"mc2_stderr\": 0.015609242157624164\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.643344709897611,\n \"acc_stderr\": 0.013998056902620196,\n \"acc_norm\": 0.6646757679180887,\n \"acc_norm_stderr\": 0.01379618294778556\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6634136626170085,\n \"acc_stderr\": 0.004715762925037027,\n \"acc_norm\": 0.831009759012149,\n \"acc_norm_stderr\": 0.0037397742854185247\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493857,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493857\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 
0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124484,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124484\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.02412112541694119,\n \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.02412112541694119\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948482,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948482\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931796,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931796\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n \"acc_stderr\": 0.030500283176545843,\n \"acc_norm\": 0.7085201793721974,\n \"acc_norm_stderr\": 0.030500283176545843\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n \"acc_stderr\": 
0.013702643715368985,\n \"acc_norm\": 0.8212005108556832,\n \"acc_norm_stderr\": 0.013702643715368985\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4547486033519553,\n \"acc_stderr\": 0.016653875777524006,\n \"acc_norm\": 0.4547486033519553,\n \"acc_norm_stderr\": 0.016653875777524006\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.02500646975579921,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.02500646975579921\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4641460234680574,\n \"acc_stderr\": 0.012737361318730581,\n \"acc_norm\": 0.4641460234680574,\n \"acc_norm_stderr\": 0.012737361318730581\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.0290294228156814,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.0290294228156814\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.019162418588623557,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.019162418588623557\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484378,\n \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484378\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.49938800489596086,\n \"mc1_stderr\": 0.01750348793889251,\n \"mc2\": 0.649568492897451,\n \"mc2_stderr\": 0.015609242157624164\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7924230465666929,\n \"acc_stderr\": 0.011398593419386798\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4973464746019712,\n \"acc_stderr\": 0.01377229076885817\n }\n}\n```", "repo_url": 
"https://huggingface.co/Kquant03/Ryu-4x7B-MoE-bf16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|arc:challenge|25_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|gsm8k|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hellaswag|10_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-51-35.789085.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-51-35.789085.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-51-35.789085.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-51-35.789085.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-51-35.789085.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|winogrande|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["results_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": 
["results_2024-01-13T23-51-35.789085.parquet"]}]}]} | 2024-01-13T23:54:14+00:00 |
18c6dda7ced23ea3e009ee7eb875d0cc9e6946ad | Budzisnki/Teste_agro | [
"license:openrail",
"region:us"
] | 2024-01-13T23:56:25+00:00 | {"license": "openrail"} | 2024-01-13T23:57:24+00:00 |
|
91df9b666b59a06f1acf314dc70db259b55006f2 | elprofecoss/sentiment-banking | [
"region:us"
] | 2024-01-14T00:05:45+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "inputs", "struct": [{"name": "text", "dtype": "string"}]}, {"name": "prediction", "list": [{"name": "label", "dtype": "string"}, {"name": "score", "dtype": "float64"}]}, {"name": "prediction_agent", "dtype": "string"}, {"name": "annotation", "dtype": "null"}, {"name": "annotation_agent", "dtype": "null"}, {"name": "multi_label", "dtype": "bool"}, {"name": "explanation", "dtype": "null"}, {"name": "id", "dtype": "null"}, {"name": "metadata", "struct": [{"name": "category", "dtype": "int64"}]}, {"name": "status", "dtype": "string"}, {"name": "event_timestamp", "dtype": "null"}, {"name": "metrics", "dtype": "null"}], "splits": [{"name": "train", "num_bytes": 1205760, "num_examples": 5001}], "download_size": 449589, "dataset_size": 1205760}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T20:03:49+00:00 |
|
cbdbe0de5538bd5c0d58a03b7da9881547e22694 | MatsuoDochiai/Joao | [
"license:openrail",
"region:us"
] | 2024-01-14T00:09:08+00:00 | {"license": "openrail"} | 2024-01-14T00:09:49+00:00 |
|
079a1d3982e8b6fd3edbdc03c39b43415be25326 | Praghxx/nickkit | [
"region:us"
] | 2024-01-14T00:18:57+00:00 | {} | 2024-01-14T00:19:31+00:00 |
|
54c85b373fbea17248c534193bd9a854fee1f832 | natolambert/interconnects-figures | [
"region:us"
] | 2024-01-14T00:30:00+00:00 | {} | 2024-02-16T04:23:29+00:00 |
|
d2f3ee80877c5b30f178503ce6a92831fa9341a9 | Berzerker/iapr_tcr11 | [
"language:en",
"region:us"
] | 2024-01-14T00:33:39+00:00 | {"language": ["en"], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "output_json_dumpsed", "dtype": "string"}]}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/*.parquet"}]}]} | 2024-01-15T04:02:20+00:00 |
|
acfebe75504019b17d16c1152255faa14c1c5ecb |
# Dataset Card for Evaluation run of jefferylovely/AiMaven-Orca2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jefferylovely/AiMaven-Orca2](https://huggingface.co/jefferylovely/AiMaven-Orca2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jefferylovely__AiMaven-Orca2",
"harness_winogrande_5",
split="train")
```
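The aggregated metrics live in the "results" configuration mentioned above. Below is a minimal sketch of loading it, assuming the "results" config exposes the same timestamped splits plus a "latest" alias as the task configs (as listed in this repo's config metadata):
```python
from datasets import load_dataset

# aggregated metrics of the run; "latest" is assumed to alias the most recent timestamped split
results = load_dataset(
    "open-llm-leaderboard/details_jefferylovely__AiMaven-Orca2",
    "results",
    split="latest",
)
print(results[0])  # single row holding the aggregated results
```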
## Latest results
These are the [latest results from run 2024-01-14T00:32:07.397103](https://huggingface.co/datasets/open-llm-leaderboard/details_jefferylovely__AiMaven-Orca2/blob/main/results_2024-01-14T00-32-07.397103.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.54465465733523,
"acc_stderr": 0.034129622181532,
"acc_norm": 0.5502915050766849,
"acc_norm_stderr": 0.03486859931303051,
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088365,
"mc2": 0.5343298654242948,
"mc2_stderr": 0.01618337374565952
},
"harness|arc:challenge|25": {
"acc": 0.5187713310580204,
"acc_stderr": 0.014601090150633964,
"acc_norm": 0.5469283276450512,
"acc_norm_stderr": 0.014546892052005628
},
"harness|hellaswag|10": {
"acc": 0.6054570802628958,
"acc_stderr": 0.004877534215987091,
"acc_norm": 0.789982075283808,
"acc_norm_stderr": 0.0040648854960034396
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.039255233810529325,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.039255233810529325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955784,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955784
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.0413212501972337,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.0413212501972337
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562429,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562429
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.024796060602699947,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.024796060602699947
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871136,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871136
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237103,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237103
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6387096774193548,
"acc_stderr": 0.02732754844795754,
"acc_norm": 0.6387096774193548,
"acc_norm_stderr": 0.02732754844795754
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6616161616161617,
"acc_stderr": 0.03371124142626303,
"acc_norm": 0.6616161616161617,
"acc_norm_stderr": 0.03371124142626303
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7616580310880829,
"acc_stderr": 0.030748905363909895,
"acc_norm": 0.7616580310880829,
"acc_norm_stderr": 0.030748905363909895
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5384615384615384,
"acc_stderr": 0.025275892070240637,
"acc_norm": 0.5384615384615384,
"acc_norm_stderr": 0.025275892070240637
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815635,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815635
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.032219436365661956,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.032219436365661956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.726605504587156,
"acc_stderr": 0.01910929984609829,
"acc_norm": 0.726605504587156,
"acc_norm_stderr": 0.01910929984609829
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.030778554678693268,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.030778554678693268
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572924,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572924
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.027236013946196694,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.027236013946196694
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7381864623243933,
"acc_stderr": 0.015720838678445266,
"acc_norm": 0.7381864623243933,
"acc_norm_stderr": 0.015720838678445266
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5982658959537572,
"acc_stderr": 0.026394104177643634,
"acc_norm": 0.5982658959537572,
"acc_norm_stderr": 0.026394104177643634
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574904,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574904
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.028146405993096358,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.028146405993096358
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6109324758842444,
"acc_stderr": 0.027690337536485372,
"acc_norm": 0.6109324758842444,
"acc_norm_stderr": 0.027690337536485372
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.02691500301138015,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.02691500301138015
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.02923346574557309,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.02923346574557309
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38070404172099087,
"acc_stderr": 0.012401430654645893,
"acc_norm": 0.38070404172099087,
"acc_norm_stderr": 0.012401430654645893
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5343137254901961,
"acc_stderr": 0.020180144843307293,
"acc_norm": 0.5343137254901961,
"acc_norm_stderr": 0.020180144843307293
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.030387262919547728,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.030387262919547728
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5472636815920398,
"acc_stderr": 0.03519702717576915,
"acc_norm": 0.5472636815920398,
"acc_norm_stderr": 0.03519702717576915
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.038913644958358196,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.038913644958358196
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03377310252209204,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03377310252209204
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088365,
"mc2": 0.5343298654242948,
"mc2_stderr": 0.01618337374565952
},
"harness|winogrande|5": {
"acc": 0.7434885556432518,
"acc_stderr": 0.012273648008759987
},
"harness|gsm8k|5": {
"acc": 0.2259287338893101,
"acc_stderr": 0.011519098777279958
}
}
```
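If you prefer to inspect the raw per-run file rather than the `datasets` configurations, the JSON linked above can be downloaded directly. This is a minimal sketch, assuming the file name shown in the link and the standard `huggingface_hub` download API; it also allows for the metrics being nested under a top-level "results" key in the full file:
```python
import json
from huggingface_hub import hf_hub_download

# download the raw results file for this run (file name taken from the link above)
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_jefferylovely__AiMaven-Orca2",
    repo_type="dataset",
    filename="results_2024-01-14T00-32-07.397103.json",
)

with open(path) as f:
    data = json.load(f)

# the full file may nest the metrics under a "results" key; fall back to the top level otherwise
scores = data.get("results", data)

# e.g. print the normalized accuracy of every MMLU (hendrycksTest) subtask
for task, metrics in scores.items():
    if task.startswith("harness|hendrycksTest"):
        print(task, metrics["acc_norm"])
```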
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_jefferylovely__AiMaven-Orca2 | [
"region:us"
] | 2024-01-14T00:34:23+00:00 | {"pretty_name": "Evaluation run of jefferylovely/AiMaven-Orca2", "dataset_summary": "Dataset automatically created during the evaluation run of model [jefferylovely/AiMaven-Orca2](https://huggingface.co/jefferylovely/AiMaven-Orca2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jefferylovely__AiMaven-Orca2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T00:32:07.397103](https://huggingface.co/datasets/open-llm-leaderboard/details_jefferylovely__AiMaven-Orca2/blob/main/results_2024-01-14T00-32-07.397103.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.54465465733523,\n \"acc_stderr\": 0.034129622181532,\n \"acc_norm\": 0.5502915050766849,\n \"acc_norm_stderr\": 0.03486859931303051,\n \"mc1\": 0.3659730722154223,\n \"mc1_stderr\": 0.016862941684088365,\n \"mc2\": 0.5343298654242948,\n \"mc2_stderr\": 0.01618337374565952\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5187713310580204,\n \"acc_stderr\": 0.014601090150633964,\n \"acc_norm\": 0.5469283276450512,\n \"acc_norm_stderr\": 0.014546892052005628\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6054570802628958,\n \"acc_stderr\": 0.004877534215987091,\n \"acc_norm\": 0.789982075283808,\n \"acc_norm_stderr\": 0.0040648854960034396\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.039255233810529325,\n \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.039255233810529325\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955784,\n \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955784\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n \"acc_stderr\": 0.0413212501972337,\n \"acc_norm\": 0.5763888888888888,\n \"acc_norm_stderr\": 0.0413212501972337\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 
0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562429,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562429\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878151,\n \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878151\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.024796060602699947,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.024796060602699947\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237103,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237103\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6387096774193548,\n \"acc_stderr\": 0.02732754844795754,\n \"acc_norm\": 0.6387096774193548,\n \"acc_norm_stderr\": 0.02732754844795754\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6616161616161617,\n \"acc_stderr\": 0.03371124142626303,\n \"acc_norm\": 0.6616161616161617,\n \"acc_norm_stderr\": 0.03371124142626303\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7616580310880829,\n \"acc_stderr\": 0.030748905363909895,\n \"acc_norm\": 0.7616580310880829,\n \"acc_norm_stderr\": 0.030748905363909895\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5384615384615384,\n \"acc_stderr\": 0.025275892070240637,\n \"acc_norm\": 0.5384615384615384,\n \"acc_norm_stderr\": 0.025275892070240637\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815635,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815635\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.032219436365661956,\n \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.032219436365661956\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.726605504587156,\n \"acc_stderr\": 0.01910929984609829,\n \"acc_norm\": 0.726605504587156,\n \"acc_norm_stderr\": 0.01910929984609829\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7401960784313726,\n \"acc_stderr\": 0.030778554678693268,\n \"acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.030778554678693268\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.6412556053811659,\n \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.027236013946196694,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.027236013946196694\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7381864623243933,\n 
\"acc_stderr\": 0.015720838678445266,\n \"acc_norm\": 0.7381864623243933,\n \"acc_norm_stderr\": 0.015720838678445266\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5982658959537572,\n \"acc_stderr\": 0.026394104177643634,\n \"acc_norm\": 0.5982658959537572,\n \"acc_norm_stderr\": 0.026394104177643634\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574904,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574904\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5915032679738562,\n \"acc_stderr\": 0.028146405993096358,\n \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.028146405993096358\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6109324758842444,\n \"acc_stderr\": 0.027690337536485372,\n \"acc_norm\": 0.6109324758842444,\n \"acc_norm_stderr\": 0.027690337536485372\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.02691500301138015,\n \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.02691500301138015\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.40070921985815605,\n \"acc_stderr\": 0.02923346574557309,\n \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.02923346574557309\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38070404172099087,\n \"acc_stderr\": 0.012401430654645893,\n \"acc_norm\": 0.38070404172099087,\n \"acc_norm_stderr\": 0.012401430654645893\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5343137254901961,\n \"acc_stderr\": 0.020180144843307293,\n \"acc_norm\": 0.5343137254901961,\n \"acc_norm_stderr\": 0.020180144843307293\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547728,\n \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547728\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5472636815920398,\n \"acc_stderr\": 0.03519702717576915,\n \"acc_norm\": 0.5472636815920398,\n \"acc_norm_stderr\": 0.03519702717576915\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n \"acc_stderr\": 0.038913644958358196,\n \"acc_norm\": 0.4879518072289157,\n \"acc_norm_stderr\": 0.038913644958358196\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03377310252209204,\n \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03377310252209204\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3659730722154223,\n \"mc1_stderr\": 0.016862941684088365,\n \"mc2\": 0.5343298654242948,\n \"mc2_stderr\": 0.01618337374565952\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759987\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2259287338893101,\n \"acc_stderr\": 0.011519098777279958\n }\n}\n```", 
"repo_url": "https://huggingface.co/jefferylovely/AiMaven-Orca2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|arc:challenge|25_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|gsm8k|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hellaswag|10_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T00-32-07.397103.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T00-32-07.397103.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T00-32-07.397103.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T00-32-07.397103.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|winogrande|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T00_32_07.397103", "path": ["results_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T00-32-07.397103.parquet"]}]}]} | 2024-01-14T00:34:45+00:00 |
5ed688ed6e1be468a2c8aa53ae5394a91f408e96 |
# Dataset Card for Evaluation run of ibndias/Nous-Hermes-2-MoE-2x34B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ibndias/Nous-Hermes-2-MoE-2x34B](https://huggingface.co/ibndias/Nous-Hermes-2-MoE-2x34B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
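The available configurations can be listed programmatically before choosing one to load (a minimal sketch using the `datasets` library; the repository name is taken from this card):

```python
from datasets import get_dataset_config_names

# List the per-task configurations declared in this repository.
configs = get_dataset_config_names("open-llm-leaderboard/details_ibndias__Nous-Hermes-2-MoE-2x34B")
print(len(configs), "configurations, e.g.", configs[:3])
```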
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ibndias__Nous-Hermes-2-MoE-2x34B",
"harness_winogrande_5",
split="train")
```
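The aggregated metrics can be loaded in the same way through the "results" configuration (a minimal sketch; the "latest" split is assumed to point to the most recent run, as described above):

```python
from datasets import load_dataset

# Load the aggregated results of the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_ibndias__Nous-Hermes-2-MoE-2x34B",
    "results",
    split="latest",
)
print(results)  # inspect the available columns before indexing into them
```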
## Latest results
These are the [latest results from run 2024-01-14T00:41:59.190674](https://huggingface.co/datasets/open-llm-leaderboard/details_ibndias__Nous-Hermes-2-MoE-2x34B/blob/main/results_2024-01-14T00-41-59.190674.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.761185340730833,
"acc_stderr": 0.02810264232361143,
"acc_norm": 0.7648166441855127,
"acc_norm_stderr": 0.02863812731410329,
"mc1": 0.41982864137086906,
"mc1_stderr": 0.01727703030177577,
"mc2": 0.5808164969122677,
"mc2_stderr": 0.014977589951125109
},
"harness|arc:challenge|25": {
"acc": 0.6424914675767918,
"acc_stderr": 0.014005494275916573,
"acc_norm": 0.6663822525597269,
"acc_norm_stderr": 0.013778687054176534
},
"harness|hellaswag|10": {
"acc": 0.6606253734315873,
"acc_stderr": 0.004725293905228259,
"acc_norm": 0.8572993427604063,
"acc_norm_stderr": 0.003490524965061915
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9013157894736842,
"acc_stderr": 0.024270227737522715,
"acc_norm": 0.9013157894736842,
"acc_norm_stderr": 0.024270227737522715
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7962264150943397,
"acc_stderr": 0.024790784501775402,
"acc_norm": 0.7962264150943397,
"acc_norm_stderr": 0.024790784501775402
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8958333333333334,
"acc_stderr": 0.025545239210256917,
"acc_norm": 0.8958333333333334,
"acc_norm_stderr": 0.025545239210256917
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0349610148119118,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0349610148119118
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.049665709039785295,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.049665709039785295
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7787234042553192,
"acc_stderr": 0.027136349602424056,
"acc_norm": 0.7787234042553192,
"acc_norm_stderr": 0.027136349602424056
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5701754385964912,
"acc_stderr": 0.04657047260594964,
"acc_norm": 0.5701754385964912,
"acc_norm_stderr": 0.04657047260594964
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7862068965517242,
"acc_stderr": 0.034165204477475494,
"acc_norm": 0.7862068965517242,
"acc_norm_stderr": 0.034165204477475494
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6746031746031746,
"acc_stderr": 0.024130158299762613,
"acc_norm": 0.6746031746031746,
"acc_norm_stderr": 0.024130158299762613
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8903225806451613,
"acc_stderr": 0.01777677870048519,
"acc_norm": 0.8903225806451613,
"acc_norm_stderr": 0.01777677870048519
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.625615763546798,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.625615763546798,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8727272727272727,
"acc_stderr": 0.026024657651656187,
"acc_norm": 0.8727272727272727,
"acc_norm_stderr": 0.026024657651656187
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02239078763821677,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02239078763821677
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9792746113989638,
"acc_stderr": 0.010281417011909042,
"acc_norm": 0.9792746113989638,
"acc_norm_stderr": 0.010281417011909042
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7923076923076923,
"acc_stderr": 0.020567539567246787,
"acc_norm": 0.7923076923076923,
"acc_norm_stderr": 0.020567539567246787
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4703703703703704,
"acc_stderr": 0.030431963547936584,
"acc_norm": 0.4703703703703704,
"acc_norm_stderr": 0.030431963547936584
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.023005459446673936,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.023005459446673936
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5099337748344371,
"acc_stderr": 0.04081677107248437,
"acc_norm": 0.5099337748344371,
"acc_norm_stderr": 0.04081677107248437
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9247706422018349,
"acc_stderr": 0.011308662537571762,
"acc_norm": 0.9247706422018349,
"acc_norm_stderr": 0.011308662537571762
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03214952147802749,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03214952147802749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.919831223628692,
"acc_stderr": 0.01767667999189163,
"acc_norm": 0.919831223628692,
"acc_norm_stderr": 0.01767667999189163
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7892376681614349,
"acc_stderr": 0.02737309550054019,
"acc_norm": 0.7892376681614349,
"acc_norm_stderr": 0.02737309550054019
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342344,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342344
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9173553719008265,
"acc_stderr": 0.025135382356604227,
"acc_norm": 0.9173553719008265,
"acc_norm_stderr": 0.025135382356604227
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.029239272675632748,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.029239272675632748
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8711656441717791,
"acc_stderr": 0.026321383198783674,
"acc_norm": 0.8711656441717791,
"acc_norm_stderr": 0.026321383198783674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.9223300970873787,
"acc_stderr": 0.02650144078476276,
"acc_norm": 0.9223300970873787,
"acc_norm_stderr": 0.02650144078476276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9316239316239316,
"acc_stderr": 0.01653462768431136,
"acc_norm": 0.9316239316239316,
"acc_norm_stderr": 0.01653462768431136
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9054916985951469,
"acc_stderr": 0.01046101533819307,
"acc_norm": 0.9054916985951469,
"acc_norm_stderr": 0.01046101533819307
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8265895953757225,
"acc_stderr": 0.020383229551135026,
"acc_norm": 0.8265895953757225,
"acc_norm_stderr": 0.020383229551135026
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6346368715083799,
"acc_stderr": 0.01610483388014229,
"acc_norm": 0.6346368715083799,
"acc_norm_stderr": 0.01610483388014229
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02027940293617458,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02027940293617458
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8295819935691319,
"acc_stderr": 0.02135534302826405,
"acc_norm": 0.8295819935691319,
"acc_norm_stderr": 0.02135534302826405
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.01748643278588071,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.01748643278588071
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6595744680851063,
"acc_stderr": 0.028267657482650154,
"acc_norm": 0.6595744680851063,
"acc_norm_stderr": 0.028267657482650154
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.605606258148631,
"acc_stderr": 0.012482141665631176,
"acc_norm": 0.605606258148631,
"acc_norm_stderr": 0.012482141665631176
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8125,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.015422512066262554,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.015422512066262554
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8367346938775511,
"acc_stderr": 0.023661699177098608,
"acc_norm": 0.8367346938775511,
"acc_norm_stderr": 0.023661699177098608
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015578,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015578
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41982864137086906,
"mc1_stderr": 0.01727703030177577,
"mc2": 0.5808164969122677,
"mc2_stderr": 0.014977589951125109
},
"harness|winogrande|5": {
"acc": 0.8334648776637726,
"acc_stderr": 0.010470796496781103
},
"harness|gsm8k|5": {
"acc": 0.6952236542835482,
"acc_stderr": 0.012679297549515425
}
}
```
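For reference, here is a minimal sketch of how the per-task entries above can be aggregated, assuming `results` holds the dictionary printed above (for example obtained via `json.load` on the linked results file):

```python
# `results` is assumed to be the dictionary shown above.
mmlu = [
    scores["acc_norm"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {sum(mmlu) / len(mmlu):.4f}")
```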
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ibndias__Nous-Hermes-2-MoE-2x34B | [
"region:us"
] | 2024-01-14T00:44:12+00:00 | {"pretty_name": "Evaluation run of ibndias/Nous-Hermes-2-MoE-2x34B", "dataset_summary": "Dataset automatically created during the evaluation run of model [ibndias/Nous-Hermes-2-MoE-2x34B](https://huggingface.co/ibndias/Nous-Hermes-2-MoE-2x34B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ibndias__Nous-Hermes-2-MoE-2x34B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T00:41:59.190674](https://huggingface.co/datasets/open-llm-leaderboard/details_ibndias__Nous-Hermes-2-MoE-2x34B/blob/main/results_2024-01-14T00-41-59.190674.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.761185340730833,\n \"acc_stderr\": 0.02810264232361143,\n \"acc_norm\": 0.7648166441855127,\n \"acc_norm_stderr\": 0.02863812731410329,\n \"mc1\": 0.41982864137086906,\n \"mc1_stderr\": 0.01727703030177577,\n \"mc2\": 0.5808164969122677,\n \"mc2_stderr\": 0.014977589951125109\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6424914675767918,\n \"acc_stderr\": 0.014005494275916573,\n \"acc_norm\": 0.6663822525597269,\n \"acc_norm_stderr\": 0.013778687054176534\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6606253734315873,\n \"acc_stderr\": 0.004725293905228259,\n \"acc_norm\": 0.8572993427604063,\n \"acc_norm_stderr\": 0.003490524965061915\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.9013157894736842,\n \"acc_stderr\": 0.024270227737522715,\n \"acc_norm\": 0.9013157894736842,\n \"acc_norm_stderr\": 0.024270227737522715\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7962264150943397,\n \"acc_stderr\": 0.024790784501775402,\n \"acc_norm\": 0.7962264150943397,\n \"acc_norm_stderr\": 0.024790784501775402\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n \"acc_stderr\": 0.025545239210256917,\n \"acc_norm\": 0.8958333333333334,\n \"acc_norm_stderr\": 0.025545239210256917\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n 
\"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0349610148119118,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0349610148119118\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.049665709039785295,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.049665709039785295\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7787234042553192,\n \"acc_stderr\": 0.027136349602424056,\n \"acc_norm\": 0.7787234042553192,\n \"acc_norm_stderr\": 0.027136349602424056\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5701754385964912,\n \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.5701754385964912,\n \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7862068965517242,\n \"acc_stderr\": 0.034165204477475494,\n \"acc_norm\": 0.7862068965517242,\n \"acc_norm_stderr\": 0.034165204477475494\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6746031746031746,\n \"acc_stderr\": 0.024130158299762613,\n \"acc_norm\": 0.6746031746031746,\n \"acc_norm_stderr\": 0.024130158299762613\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8903225806451613,\n \"acc_stderr\": 0.01777677870048519,\n \"acc_norm\": 0.8903225806451613,\n \"acc_norm_stderr\": 0.01777677870048519\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.625615763546798,\n \"acc_stderr\": 0.03405155380561952,\n \"acc_norm\": 0.625615763546798,\n \"acc_norm_stderr\": 0.03405155380561952\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8727272727272727,\n \"acc_stderr\": 0.026024657651656187,\n \"acc_norm\": 0.8727272727272727,\n \"acc_norm_stderr\": 0.026024657651656187\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.02239078763821677,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.02239078763821677\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909042,\n \"acc_norm\": 0.9792746113989638,\n \"acc_norm_stderr\": 0.010281417011909042\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7923076923076923,\n \"acc_stderr\": 0.020567539567246787,\n \"acc_norm\": 0.7923076923076923,\n \"acc_norm_stderr\": 0.020567539567246787\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4703703703703704,\n \"acc_stderr\": 0.030431963547936584,\n \"acc_norm\": 0.4703703703703704,\n \"acc_norm_stderr\": 0.030431963547936584\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.023005459446673936,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.023005459446673936\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248437,\n \"acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248437\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9247706422018349,\n \"acc_stderr\": 0.011308662537571762,\n \"acc_norm\": 0.9247706422018349,\n \"acc_norm_stderr\": 0.011308662537571762\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03214952147802749,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03214952147802749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.919831223628692,\n \"acc_stderr\": 0.01767667999189163,\n \"acc_norm\": 0.919831223628692,\n \"acc_norm_stderr\": 0.01767667999189163\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.7892376681614349,\n \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342344,\n \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342344\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9173553719008265,\n \"acc_stderr\": 0.025135382356604227,\n \"acc_norm\": 0.9173553719008265,\n \"acc_norm_stderr\": 0.025135382356604227\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n \"acc_stderr\": 0.029239272675632748,\n \"acc_norm\": 0.8981481481481481,\n \"acc_norm_stderr\": 0.029239272675632748\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8711656441717791,\n \"acc_stderr\": 0.026321383198783674,\n \"acc_norm\": 0.8711656441717791,\n \"acc_norm_stderr\": 0.026321383198783674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9223300970873787,\n \"acc_stderr\": 0.02650144078476276,\n \"acc_norm\": 0.9223300970873787,\n \"acc_norm_stderr\": 0.02650144078476276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9316239316239316,\n \"acc_stderr\": 0.01653462768431136,\n \"acc_norm\": 0.9316239316239316,\n \"acc_norm_stderr\": 0.01653462768431136\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.9054916985951469,\n \"acc_stderr\": 0.01046101533819307,\n \"acc_norm\": 0.9054916985951469,\n \"acc_norm_stderr\": 0.01046101533819307\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8265895953757225,\n \"acc_stderr\": 0.020383229551135026,\n \"acc_norm\": 0.8265895953757225,\n \"acc_norm_stderr\": 0.020383229551135026\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6346368715083799,\n \"acc_stderr\": 0.01610483388014229,\n \"acc_norm\": 0.6346368715083799,\n \"acc_norm_stderr\": 0.01610483388014229\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.02027940293617458,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02027940293617458\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8295819935691319,\n \"acc_stderr\": 0.02135534302826405,\n \"acc_norm\": 0.8295819935691319,\n \"acc_norm_stderr\": 0.02135534302826405\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.01748643278588071,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.01748643278588071\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6595744680851063,\n \"acc_stderr\": 0.028267657482650154,\n \"acc_norm\": 0.6595744680851063,\n \"acc_norm_stderr\": 0.028267657482650154\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.605606258148631,\n \"acc_stderr\": 0.012482141665631176,\n \"acc_norm\": 0.605606258148631,\n \"acc_norm_stderr\": 0.012482141665631176\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.023709788253811766,\n \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.023709788253811766\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.015422512066262554,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.015422512066262554\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8367346938775511,\n \"acc_stderr\": 0.023661699177098608,\n \"acc_norm\": 0.8367346938775511,\n \"acc_norm_stderr\": 0.023661699177098608\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015578,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015578\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41982864137086906,\n \"mc1_stderr\": 0.01727703030177577,\n \"mc2\": 0.5808164969122677,\n \"mc2_stderr\": 0.014977589951125109\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8334648776637726,\n \"acc_stderr\": 0.010470796496781103\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6952236542835482,\n \"acc_stderr\": 0.012679297549515425\n }\n}\n```", 
"repo_url": "https://huggingface.co/ibndias/Nous-Hermes-2-MoE-2x34B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|arc:challenge|25_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|gsm8k|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hellaswag|10_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T00-41-59.190674.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T00-41-59.190674.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T00-41-59.190674.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T00-41-59.190674.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|winogrande|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T00_41_59.190674", "path": ["results_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T00-41-59.190674.parquet"]}]}]} | 2024-01-14T00:44:34+00:00 |
8636d32b4f17579828aa0bf3e965fb6ed8aa705e | chambers5710/tvc_feature_release | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T00:56:10+00:00 | {"license": "apache-2.0"} | 2024-01-14T00:56:10+00:00 |
|
0df2fdcf6a51f7f4adfad2553b10a5c5c5c8cad9 | # TLDR
* Wikipedia page: [Road signs in Malaysia](https://en.wikipedia.org/wiki/Road_signs_in_Malaysia)
* number of images: 365
* contributed to: https://github.com/orgs/malaysia-ai/projects/9/views/1?pane=issue&itemId=43619647
* date scraped: 14th January 2024 | wanadzhar913/wikipedia-malaysian-road-sign-images | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T01:01:06+00:00 | {"license": "apache-2.0"} | 2024-01-14T01:20:28+00:00 |
61979f1e3f7e944d4c83d43c4cfabc985fd5cb3a | llm-aes/asappp-3-6-original | [
"region:us"
] | 2024-01-14T01:05:56+00:00 | {"dataset_info": {"features": [{"name": "Essay_ID", "dtype": "int64"}, {"name": "essay_set", "dtype": "int64"}, {"name": "essay", "dtype": "string"}, {"name": "rater1_domain1", "dtype": "int64"}, {"name": "rater2_domain1", "dtype": "int64"}, {"name": "domain1_score", "dtype": "int64"}, {"name": "rubrics", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "Content", "dtype": "int64"}, {"name": "Prompt_Adherence", "dtype": "int64"}, {"name": "Language", "dtype": "int64"}, {"name": "Narrativity", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 60382165, "num_examples": 7101}], "download_size": 2445084, "dataset_size": 60382165}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T08:20:22+00:00 |
|
b52abf8b1f4dee4cc46f3c1e41fd54a953963286 |
# Dataset of ting_an/定安/定安 (Azur Lane)
This is the dataset of ting_an/定安/定安 (Azur Lane), containing 40 images and their tags.
The core tags of this character are `breasts, earrings, bangs, black_hair, large_breasts, long_hair, mole, mole_under_eye, huge_breasts, purple_eyes, hair_ornament, pink_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 40 | 58.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ting_an_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 40 | 29.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ting_an_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 93 | 63.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ting_an_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 40 | 49.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ting_an_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 93 | 94.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ting_an_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ting_an_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
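The cluster tables below were derived from these per-image tags. As a rough way to reproduce that view locally, the sketch below tallies tag frequencies over the loaded source. It assumes `item.meta['tags']` holds either a list of tag names or a mapping from tag name to confidence score (the exact structure depends on the crawler), so treat it as illustrative rather than canonical.

```python
from collections import Counter

from waifuc.source import LocalSource

# count how often each tag appears across the extracted dataset
source = LocalSource('dataset_dir')
counter = Counter()
for item in source:
    tags = item.meta.get('tags', {})
    # tags may be a dict (tag -> score) or a plain list of tag names
    names = tags.keys() if isinstance(tags, dict) else tags
    counter.update(names)

# show the most common tags, loosely mirroring the clusters listed below
for tag, count in counter.most_common(20):
    print(f'{tag}: {count}')
```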
## List of Clusters
List of tag clustering results; some recurring outfits for this character may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, bare_shoulders, blush, chinese_clothes, cleavage, covered_navel, jewelry, looking_at_viewer, solo, cameltoe, curvy, parted_lips, thick_thighs, dress, hair_over_shoulder, revealing_clothes, smile, braid, leotard, mature_female, pelvic_curtain, see-through, sideboob, hand_up, indoors, plump, pussy_juice |
| 1 | 5 |  |  |  |  |  | 1girl, bare_shoulders, breast_curtains, jewelry, looking_at_viewer, open_mouth, solo, blush, cleavage, covered_navel, fur_trim, pelvic_curtain, revealing_clothes, see-through, sideboob, china_dress, cowboy_shot, parted_bangs, smile, thighs, white_thighhighs, covered_nipples, detached_sleeves, hair_over_shoulder, mole_on_breast, underwear |
| 2 | 7 |  |  |  |  |  | 1girl, blush, 1boy, jewelry, solo_focus, cum_on_breasts, open_mouth, censored, heart, nipples, nude, sweat, symbol-shaped_pupils, bare_shoulders, breast_grab, breasts_squeezed_together, cleavage, cum_on_hair, facial, grabbing, looking_at_viewer, on_back, paizuri_under_clothes, penis, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | blush | chinese_clothes | cleavage | covered_navel | jewelry | looking_at_viewer | solo | cameltoe | curvy | parted_lips | thick_thighs | dress | hair_over_shoulder | revealing_clothes | smile | braid | leotard | mature_female | pelvic_curtain | see-through | sideboob | hand_up | indoors | plump | pussy_juice | breast_curtains | open_mouth | fur_trim | china_dress | cowboy_shot | parted_bangs | thighs | white_thighhighs | covered_nipples | detached_sleeves | mole_on_breast | underwear | 1boy | solo_focus | cum_on_breasts | censored | heart | nipples | nude | sweat | symbol-shaped_pupils | breast_grab | breasts_squeezed_together | cum_on_hair | facial | grabbing | on_back | paizuri_under_clothes | penis |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------|:------------------|:-----------|:----------------|:----------|:--------------------|:-------|:-----------|:--------|:--------------|:---------------|:--------|:---------------------|:--------------------|:--------|:--------|:----------|:----------------|:-----------------|:--------------|:-----------|:----------|:----------|:--------|:--------------|:------------------|:-------------|:-----------|:--------------|:--------------|:---------------|:---------|:-------------------|:------------------|:-------------------|:-----------------|:------------|:-------|:-------------|:-----------------|:-----------|:--------|:----------|:-------|:--------|:-----------------------|:--------------|:----------------------------|:--------------|:---------|:-----------|:----------|:------------------------|:--------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | | X | X | X | X | X | | | | | | X | X | X | | | | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | | X | | X | X | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/ting_an_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:07:28+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:17:29+00:00 |
88dc31c15afe1acab837aa8816c384291ee0b6fc |
# Dataset of marseillaise/マルセイエーズ/马赛曲 (Azur Lane)
This is the dataset of marseillaise/マルセイエーズ/马赛曲 (Azur Lane), containing 23 images and their tags.
The core tags of this character are `breasts, long_hair, red_eyes, large_breasts, bangs, white_hair, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 23 | 52.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marseillaise_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 23 | 21.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marseillaise_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 63 | 49.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marseillaise_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 23 | 43.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marseillaise_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 63 | 83.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marseillaise_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/marseillaise_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
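If you only need the pre-processed images and caption files rather than the raw waifuc metadata, the IMG+TXT packages from the table above can be fetched the same way. The sketch below downloads `dataset-800.zip` and pairs each image with a caption file sharing its stem; the side-by-side `.txt` layout is an assumption based on the IMG+TXT description, not something this card spells out.

```python
import os
import zipfile
from pathlib import Path

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package instead of the raw archive
zip_file = hf_hub_download(
    repo_id='CyberHarem/marseillaise_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its caption file (assumed to share the same stem)
for image_path in sorted(Path(dataset_dir).rglob('*')):
    if image_path.suffix.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
        continue
    caption_path = image_path.with_suffix('.txt')
    if caption_path.exists():
        print(image_path.name, '->', caption_path.read_text(encoding='utf-8').strip())
```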
## List of Clusters
List of tag clustering results; some recurring outfits for this character may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, solo, blush, cleavage, detached_sleeves, looking_at_viewer, white_dress, black_thighhighs, navel, black_gloves, closed_mouth, hair_ornament, thighs, horns, smile, cowboy_shot, long_sleeves, panties, simple_background, white_background |
| 1 | 6 |  |  |  |  |  | 1girl, black_pants, looking_at_viewer, solo, sports_bra, yoga_pants, ass, bare_shoulders, blush, no_shoes, sweat, sitting, closed_mouth, grey_hair, looking_back, white_socks |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blush | cleavage | detached_sleeves | looking_at_viewer | white_dress | black_thighhighs | navel | black_gloves | closed_mouth | hair_ornament | thighs | horns | smile | cowboy_shot | long_sleeves | panties | simple_background | white_background | black_pants | sports_bra | yoga_pants | ass | bare_shoulders | no_shoes | sweat | sitting | grey_hair | looking_back | white_socks |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-----------|:-------------------|:--------------------|:--------------|:-------------------|:--------|:---------------|:---------------|:----------------|:---------|:--------|:--------|:--------------|:---------------|:----------|:--------------------|:-------------------|:--------------|:-------------|:-------------|:------|:-----------------|:-----------|:--------|:----------|:------------|:---------------|:--------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | | | X | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
# Dataset of aulick/オーリック/奥利克 (Azur Lane)
This is the dataset of aulick/オーリック/奥利克 (Azur Lane), containing 10 images and their tags.
The core tags of this character are `hair_ornament, hairclip, short_hair, hat, beret, bangs, green_eyes, hair_between_eyes, red_hair, sailor_hat, white_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 7.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aulick_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 5.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aulick_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 20 | 9.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aulick_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 7.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aulick_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 20 | 12.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aulick_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
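The IMG+TXT packages above are plain zip archives; based on the naming, each image is assumed to ship with a sidecar `.txt` file holding its tags. Below is a minimal sketch (the internal file layout and image extensions are assumptions, not guarantees) of downloading the 800px package and iterating over image/tag pairs:
```python
import os
import zipfile
from glob import glob
from huggingface_hub import hf_hub_download

# download and extract the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/aulick_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair every tag file with the image sharing its stem (extensions are assumed)
for txt_path in sorted(glob(os.path.join(dataset_dir, '**', '*.txt'), recursive=True)):
    stem = os.path.splitext(txt_path)[0]
    image_path = next(
        (stem + ext for ext in ('.png', '.jpg', '.webp') if os.path.exists(stem + ext)),
        None,
    )
    with open(txt_path, encoding='utf-8') as f:
        tags = f.read().strip()
    print(image_path, tags)
```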
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/aulick_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, blush, solo, open_mouth, sailor_collar, looking_at_viewer, sailor_dress, white_gloves, yellow_neckerchief, :d, simple_background, sleeveless_dress, white_background, white_thighhighs, blue_dress, feathers, frilled_dress, hat_feather, holding |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | solo | open_mouth | sailor_collar | looking_at_viewer | sailor_dress | white_gloves | yellow_neckerchief | :d | simple_background | sleeveless_dress | white_background | white_thighhighs | blue_dress | feathers | frilled_dress | hat_feather | holding |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:-------------|:----------------|:--------------------|:---------------|:---------------|:---------------------|:-----|:--------------------|:-------------------|:-------------------|:-------------------|:-------------|:-----------|:----------------|:--------------|:----------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
# Dataset of hans_ludemann/ハンス・リューデマン/Z18 (Azur Lane)
This is the dataset of hans_ludemann/ハンス・リューデマン/Z18 (Azur Lane), containing 22 images and their tags.
The core tags of this character are `blonde_hair, long_hair, twintails, blue_eyes, hair_ornament, hairclip, hat, bow, fang, breasts, hair_between_eyes, small_breasts, bangs, very_long_hair, black_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 22 | 29.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hans_ludemann_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 22 | 17.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hans_ludemann_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 58 | 39.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hans_ludemann_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 22 | 27.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hans_ludemann_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 58 | 54.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hans_ludemann_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hans_ludemann_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 |  |  |  |  |  | blush, 1girl, solo, looking_at_viewer, navel, open_mouth, fingerless_gloves, smile, black_gloves, skirt, white_panties, black_thighhighs, jacket, open_clothes, training_bra |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blush | 1girl | solo | looking_at_viewer | navel | open_mouth | fingerless_gloves | smile | black_gloves | skirt | white_panties | black_thighhighs | jacket | open_clothes | training_bra |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------------------|:--------|:-------------|:--------------------|:--------|:---------------|:--------|:----------------|:-------------------|:---------|:---------------|:---------------|
| 0 | 22 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
# Dataset of oyashio/親潮/亲潮 (Azur Lane)
This is the dataset of oyashio/親潮/亲潮 (Azur Lane), containing 12 images and their tags.
The core tags of this character are `hair_ornament, hair_bun, x_hair_ornament, braid, bangs, fang, hair_between_eyes, horns, double_bun, blonde_hair, blue_eyes, pointy_ears, sidelocks, breasts, brown_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 14.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oyashio_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 8.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oyashio_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 28 | 17.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oyashio_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 12.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oyashio_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 28 | 24.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oyashio_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/oyashio_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, bare_shoulders, blush, detached_sleeves, japanese_clothes, long_sleeves, looking_at_viewer, open_mouth, simple_background, solo, white_background, wide_sleeves, :d, black_gloves, black_skirt, pleated_skirt, single_thighhigh, sleeveless, standing, uneven_legwear, full_body, partially_fingerless_gloves, shirt, side-tie_panties, single_kneehigh, black_footwear, bridal_gauntlets, crossed_bangs, green_eyes, index_finger_raised, jewelry, legs_apart, long_hair, machinery, magatama, minigirl, miniskirt, mismatched_legwear, oni_horns, pigeon-toed, sash, side_slit, single_glove, single_hair_bun, small_breasts, torpedo_tubes, turret, white_sleeves, zettai_ryouiki |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | blush | detached_sleeves | japanese_clothes | long_sleeves | looking_at_viewer | open_mouth | simple_background | solo | white_background | wide_sleeves | :d | black_gloves | black_skirt | pleated_skirt | single_thighhigh | sleeveless | standing | uneven_legwear | full_body | partially_fingerless_gloves | shirt | side-tie_panties | single_kneehigh | black_footwear | bridal_gauntlets | crossed_bangs | green_eyes | index_finger_raised | jewelry | legs_apart | long_hair | machinery | magatama | minigirl | miniskirt | mismatched_legwear | oni_horns | pigeon-toed | sash | side_slit | single_glove | single_hair_bun | small_breasts | torpedo_tubes | turret | white_sleeves | zettai_ryouiki |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------|:-------------------|:-------------------|:---------------|:--------------------|:-------------|:--------------------|:-------|:-------------------|:---------------|:-----|:---------------|:--------------|:----------------|:-------------------|:-------------|:-----------|:-----------------|:------------|:------------------------------|:--------|:-------------------|:------------------|:-----------------|:-------------------|:----------------|:-------------|:----------------------|:----------|:-------------|:------------|:------------|:-----------|:-----------|:------------|:---------------------|:------------|:--------------|:-------|:------------|:---------------|:------------------|:----------------|:----------------|:---------|:----------------|:-----------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
# Dataset of mary_celeste/メアリー・セレスト/玛丽·西莱斯特号 (Azur Lane)
This is the dataset of mary_celeste/メアリー・セレスト/玛丽·西莱斯特号 (Azur Lane), containing 27 images and their tags.
The core tags of this character are `blue_hair, breasts, horns, large_breasts, long_hair, pointy_ears, blue_eyes, bangs, hair_between_eyes, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 27 | 58.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mary_celeste_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 27 | 25.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mary_celeste_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 68 | 54.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mary_celeste_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 27 | 46.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mary_celeste_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 68 | 86.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mary_celeste_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mary_celeste_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, long_sleeves, navel, solo, looking_at_viewer, smile, torn_clothes, belt, black_coat, open_mouth, open_coat, tentacles, thighs, barefoot, blush, revealing_clothes, sitting, stomach, fang, water |
| 1 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, solo, black_dress, wings, covered_navel, parted_lips, bare_shoulders, barefoot, holding, sleeveless_dress, underboob_cutout, earrings, full_body, sideboob |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | navel | solo | looking_at_viewer | smile | torn_clothes | belt | black_coat | open_mouth | open_coat | tentacles | thighs | barefoot | blush | revealing_clothes | sitting | stomach | fang | water | black_dress | wings | covered_navel | parted_lips | bare_shoulders | holding | sleeveless_dress | underboob_cutout | earrings | full_body | sideboob |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------|:-------|:--------------------|:--------|:---------------|:-------|:-------------|:-------------|:------------|:------------|:---------|:-----------|:--------|:--------------------|:----------|:----------|:-------|:--------|:--------------|:--------|:----------------|:--------------|:-----------------|:----------|:-------------------|:-------------------|:-----------|:------------|:-----------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | | | X | X | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
# Dataset Card for Evaluation run of macadeliccc/laser-dolphin-mixtral-2x7b-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/laser-dolphin-mixtral-2x7b-dpo](https://huggingface.co/macadeliccc/laser-dolphin-mixtral-2x7b-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, with the split named after the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-2x7b-dpo",
"harness_winogrande_5",
split="train")
```
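The aggregated metrics mentioned above can be pulled the same way from the "results" configuration; a minimal sketch (the exact record layout is not documented here, so inspect the first row):
```python
from datasets import load_dataset

# aggregated results of the run; "train" always points to the latest results
results = load_dataset(
    "open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-2x7b-dpo",
    "results",
    split="train",
)
print(results[0])
```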
## Latest results
These are the [latest results from run 2024-01-14T01:13:57.359475](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-2x7b-dpo/blob/main/results_2024-01-14T01-13-57.359475.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6323249282667325,
"acc_stderr": 0.03235123186693868,
"acc_norm": 0.63602882598941,
"acc_norm_stderr": 0.03299471578731984,
"mc1": 0.4418604651162791,
"mc1_stderr": 0.01738476747898622,
"mc2": 0.6075861082832835,
"mc2_stderr": 0.015099206529299735
},
"harness|arc:challenge|25": {
"acc": 0.6245733788395904,
"acc_stderr": 0.014150631435111728,
"acc_norm": 0.659556313993174,
"acc_norm_stderr": 0.013847460518892978
},
"harness|hellaswag|10": {
"acc": 0.6661023700458076,
"acc_stderr": 0.004706398252382464,
"acc_norm": 0.8579964150567616,
"acc_norm_stderr": 0.0034834044902359936
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440679,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440679
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.024468615241478923,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.024468615241478923
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848033,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848033
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899129,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899129
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265012,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265012
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3664804469273743,
"acc_stderr": 0.016115235504865467,
"acc_norm": 0.3664804469273743,
"acc_norm_stderr": 0.016115235504865467
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44654498044328556,
"acc_stderr": 0.012697046024399682,
"acc_norm": 0.44654498044328556,
"acc_norm_stderr": 0.012697046024399682
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824876,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824876
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.019206606848825362,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.019206606848825362
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4418604651162791,
"mc1_stderr": 0.01738476747898622,
"mc2": 0.6075861082832835,
"mc2_stderr": 0.015099206529299735
},
"harness|winogrande|5": {
"acc": 0.7900552486187845,
"acc_stderr": 0.01144628062926263
},
"harness|gsm8k|5": {
"acc": 0.4829416224412434,
"acc_stderr": 0.013764467123761318
}
}
```
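If only this summary is needed, the JSON file linked above can also be fetched directly. A minimal sketch, assuming the file mirrors the excerpt above (it may instead nest the per-task scores under a `results` key), computing the mean normalized accuracy over the MMLU (hendrycksTest) subtasks:
```python
import json
from huggingface_hub import hf_hub_download

# filename taken from the link above; everything else about the layout is an assumption
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-2x7b-dpo",
    repo_type="dataset",
    filename="results_2024-01-14T01-13-57.359475.json",
)
with open(path, encoding="utf-8") as f:
    data = json.load(f)

summary = data.get("results", data)  # handle either a flat dict (as excerpted) or a nested one
mmlu = [v["acc_norm"] for k, v in summary.items() if k.startswith("harness|hendrycksTest")]
print(f"MMLU mean acc_norm over {len(mmlu)} subtasks: {sum(mmlu) / len(mmlu):.4f}")
```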
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-13-57.359475.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|winogrande|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T01_13_57.359475", "path": ["results_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T01-13-57.359475.parquet"]}]}]} | 2024-01-14T01:16:33+00:00 |
5269531984378d65735f62af11cabe5ec9fc68dc |
# Dataset of ns2000/NS2000/NS2000 (Girls' Frontline)
This is the dataset of ns2000/NS2000/NS2000 (Girls' Frontline), containing 13 images and their tags.
The core tags of this character are `animal_ears, breasts, dark-skinned_female, dark_skin, rabbit_ears, red_eyes, large_breasts, long_hair, white_hair, bangs, grey_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 13.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ns2000_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 8.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ns2000_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 28 | 16.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ns2000_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 12.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ns2000_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 28 | 21.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ns2000_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
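Any of the packaged archives above can be fetched directly with `huggingface_hub`. The sketch below is a minimal example for the 800px IMG+TXT package, assuming the `dataset-800.zip` filename from the table and an illustrative local directory name.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# fetch one of the pre-packed IMG+TXT archives listed in the table above
# (the filename comes from the table; the target directory is illustrative)
zip_file = hf_hub_download(
    repo_id='CyberHarem/ns2000_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)

extract_dir = 'dataset_800'
os.makedirs(extract_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(extract_dir)

print(sorted(os.listdir(extract_dir))[:5])  # peek at a few extracted files
```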
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ns2000_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
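For the IMG+TXT packages, each image is expected to be paired with a same-named `.txt` file holding its tags. The sketch below walks such a directory with only the standard library, assuming the `dataset_800` folder extracted in the earlier example; the pairing convention is an assumption about the packed layout, so adjust as needed.

```python
from pathlib import Path

# walk an extracted IMG+TXT package and print each image with its tag string
# (directory name and the same-basename .txt convention are assumptions)
dataset_dir = Path('dataset_800')

for image_path in sorted(dataset_dir.rglob('*')):
    if image_path.suffix.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
        continue
    tag_path = image_path.with_suffix('.txt')
    tags = tag_path.read_text(encoding='utf-8').strip() if tag_path.exists() else ''
    print(image_path.name, '->', tags)
```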
## List of Clusters
List of tag clustering results; some character-specific outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, solo, navel, cleavage, looking_at_viewer, simple_background, open_mouth, smile, white_background, blush, gloves, shorts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | navel | cleavage | looking_at_viewer | simple_background | open_mouth | smile | white_background | blush | gloves | shorts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-----------|:--------------------|:--------------------|:-------------|:--------|:-------------------|:--------|:---------|:---------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/ns2000_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:18:09+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:20:32+00:00 |