sha | text | id | tags | created_at | metadata | last_modified
---|---|---|---|---|---|---
3329766e69db0843f03f56d6c1a2a0d2b183ca29 | Danielouo/mini-platypus | [
"region:us"
] | 2024-01-17T13:05:32+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4307372, "num_examples": 1000}], "download_size": 2283061, "dataset_size": 4307372}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-17T13:17:50+00:00 |
e8661dd21fc838cd9c38b4bf5ace22dc8cf46566 |
# Dataset of velour/ベロア (Fire Emblem)
This is the dataset of velour/ベロア (Fire Emblem), containing 184 images and their tags.
The core tags of this character are `animal_ears, multicolored_hair, grey_hair, wolf_ears, long_hair, red_eyes, black_hair, tail, wolf_tail, streaked_hair, two-tone_hair, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 184 | 195.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/velour_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 184 | 119.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/velour_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 418 | 245.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/velour_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 184 | 177.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/velour_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 418 | 331.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/velour_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/velour_fireemblem',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
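For example, here is a minimal sketch (not part of the official tooling) that pulls out the images matching one cluster's outfit tags; the tag names come from cluster #0 in the tables below, and `item.meta['tags']` is assumed to be iterable over tag names, as the loading snippet above suggests:
```python
from waifuc.source import LocalSource

# Outfit tags taken from cluster #0 in the tables below
wanted = {'hood_up', 'boots', 'gloves'}

# 'dataset_dir' is the directory extracted in the loading snippet above
for item in LocalSource('dataset_dir'):
    tags = set(item.meta['tags'])  # assumption: iterable of tag names
    if wanted.issubset(tags):
        print(item.meta['filename'])
```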
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, solo, gloves, simple_background, white_background, boots, hood_up |
| 1 | 7 |  |  |  |  |  | 1girl, bangs, long_sleeves, looking_at_viewer, white_shirt, brown_gloves, corset, frills, hood_up, neck_ribbon, solo, belt_buckle, cape, closed_mouth, simple_background, black_ribbon, blush, brown_belt, white_background, black_pants, boots, brown_footwear, pouch, standing |
| 2 | 5 |  |  |  |  |  | 1boy, 1girl, hetero, open_mouth, solo_focus, nipples, vaginal, blush, cum_in_pussy, large_breasts, penis, sex_from_behind, completely_nude, medium_breasts, straddling, tears |
| 3 | 10 |  |  |  |  |  | hetero, 1boy, 1girl, nipples, penis, solo_focus, hood, facial, bar_censor, blush, large_breasts, open_mouth, white_hair, cum_on_breasts, bangs, cum_on_hair, paizuri, tongue_out |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | gloves | simple_background | white_background | boots | hood_up | bangs | long_sleeves | looking_at_viewer | white_shirt | brown_gloves | corset | frills | neck_ribbon | belt_buckle | cape | closed_mouth | black_ribbon | blush | brown_belt | black_pants | brown_footwear | pouch | standing | 1boy | hetero | open_mouth | solo_focus | nipples | vaginal | cum_in_pussy | large_breasts | penis | sex_from_behind | completely_nude | medium_breasts | straddling | tears | hood | facial | bar_censor | white_hair | cum_on_breasts | cum_on_hair | paizuri | tongue_out |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------|:--------------------|:-------------------|:--------|:----------|:--------|:---------------|:--------------------|:--------------|:---------------|:---------|:---------|:--------------|:--------------|:-------|:---------------|:---------------|:--------|:-------------|:--------------|:-----------------|:--------|:-----------|:-------|:---------|:-------------|:-------------|:----------|:----------|:---------------|:----------------|:--------|:------------------|:------------------|:-----------------|:-------------|:--------|:-------|:---------|:-------------|:-------------|:-----------------|:--------------|:----------|:-------------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | | | | | | | X | | | | | | | | | | | | X | | | | | | X | X | X | X | X | | | X | X | | | | | | X | X | X | X | X | X | X | X |
| CyberHarem/velour_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T13:08:37+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T13:46:14+00:00 |
957836bfd09d63d1989e5dc2a956030b1d053e30 |
# Dataset of lute/ルーテ (Fire Emblem)
This is the dataset of lute/ルーテ (Fire Emblem), containing 241 images and their tags.
The core tags of this character are `purple_hair, purple_eyes, breasts, twintails`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 241 | 249.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lute_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 241 | 151.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lute_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 493 | 280.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lute_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 241 | 225.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lute_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 493 | 374.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lute_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/lute_fireemblem',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
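A simple way to surface such clusters yourself is to tally tag frequencies over the raw dataset; a minimal sketch, assuming the extraction snippet above has been run and that `item.meta['tags']` iterates over tag names:
```python
from collections import Counter

from waifuc.source import LocalSource

# Tally each tag once per image across the extracted raw dataset
counts = Counter()
for item in LocalSource('dataset_dir'):
    counts.update(set(item.meta['tags']))  # set() guards against dict scores

# Frequent non-core tags hint at recurring outfits
for tag, n in counts.most_common(20):
    print(f'{tag}: {n}')
```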
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, solo, cleavage, simple_background, hair_flower, holding_book, navel, long_hair, medium_breasts, white_background, bare_shoulders, looking_at_viewer, bangs, closed_mouth, purple_bikini, collarbone, full_body, sandals |
| 1 | 8 |  |  |  |  |  | 1girl, bare_shoulders, dress, solo, cape, holding_book, simple_background, white_background, low_twintails, full_body, looking_at_viewer, short_hair, smile, upper_body |
| 2 | 21 |  |  |  |  |  | 1girl, navel, nipples, solo, collarbone, small_breasts, blush, completely_nude, pussy, looking_at_viewer, holding_book, standing, bangs, mosaic_censoring, medium_hair, full_body, open_mouth |
| 3 | 11 |  |  |  |  |  | 1girl, bare_shoulders, fur_trim, hat, long_sleeves, solo, bangs, official_alternate_costume, choker, flower, looking_at_viewer, twin_braids, boots, long_hair, open_mouth, simple_background, white_dress, white_footwear, christmas, closed_mouth, collarbone, food, white_background, full_body, holding, white_headwear |
| 4 | 5 |  |  |  |  |  | 1girl, completely_nude, hetero, mosaic_censoring, multiple_penises, nipples, solo_focus, blush, navel, on_back, 3boys, collarbone, cum_on_hair, facial, gangbang, medium_breasts, small_breasts, spread_legs, sweat, 2boys, bangs, bukkake, closed_eyes, cum_in_pussy, cum_on_breasts, double_handjob, ejaculation, hand_on_another's_head, heart, leg_grab, open_mouth, rape |
| 5 | 5 |  |  |  |  |  | 1boy, 1girl, hetero, penis, solo_focus, nipples, open_mouth, sex, vaginal, blush, cum_in_pussy, medium_breasts, mosaic_censoring, nude, cowgirl_position, girl_on_top, oral |
| 6 | 5 |  |  |  |  |  | 1girl, blush, solo, tears, arms_behind_back, crotch_rope, nipples, pussy_juice, torn_clothes, white_panties, open_mouth, peeing_self, shibari_over_clothes, small_breasts, wet_panties |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | cleavage | simple_background | hair_flower | holding_book | navel | long_hair | medium_breasts | white_background | bare_shoulders | looking_at_viewer | bangs | closed_mouth | purple_bikini | collarbone | full_body | sandals | dress | cape | low_twintails | short_hair | smile | upper_body | nipples | small_breasts | blush | completely_nude | pussy | standing | mosaic_censoring | medium_hair | open_mouth | fur_trim | hat | long_sleeves | official_alternate_costume | choker | flower | twin_braids | boots | white_dress | white_footwear | christmas | food | holding | white_headwear | hetero | multiple_penises | solo_focus | on_back | 3boys | cum_on_hair | facial | gangbang | spread_legs | sweat | 2boys | bukkake | closed_eyes | cum_in_pussy | cum_on_breasts | double_handjob | ejaculation | hand_on_another's_head | heart | leg_grab | rape | 1boy | penis | sex | vaginal | nude | cowgirl_position | girl_on_top | oral | tears | arms_behind_back | crotch_rope | pussy_juice | torn_clothes | white_panties | peeing_self | shibari_over_clothes | wet_panties |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:--------------------|:--------------|:---------------|:--------|:------------|:-----------------|:-------------------|:-----------------|:--------------------|:--------|:---------------|:----------------|:-------------|:------------|:----------|:--------|:-------|:----------------|:-------------|:--------|:-------------|:----------|:----------------|:--------|:------------------|:--------|:-----------|:-------------------|:--------------|:-------------|:-----------|:------|:---------------|:-----------------------------|:---------|:---------|:--------------|:--------|:--------------|:-----------------|:------------|:-------|:----------|:-----------------|:---------|:-------------------|:-------------|:----------|:--------|:--------------|:---------|:-----------|:--------------|:--------|:--------|:----------|:--------------|:---------------|:-----------------|:-----------------|:--------------|:-------------------------|:--------|:-----------|:-------|:-------|:--------|:------|:----------|:-------|:-------------------|:--------------|:-------|:--------|:-------------------|:--------------|:--------------|:---------------|:----------------|:--------------|:-----------------------|:--------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | | X | | X | | | | X | X | X | | | | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 21 |  |  |  |  |  | X | X | | | | X | X | | | | | X | X | | | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | X | | X | | | | X | | X | X | X | X | X | | X | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | | | | X | | X | | | | X | | | X | | | | | | | | | X | X | X | X | | | X | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | | | | | | X | | | | | | | | | | | | | | | | X | | X | | | | X | | X | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
| CyberHarem/lute_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T13:08:44+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T13:51:26+00:00 |
0e422be8475fc97c8e1113bdb79cebdbfe5acb4e | # Chinese Version of the LIMA Dataset
## Dataset Construction Method
- The questions from the original LIMA dataset were translated into Chinese using GPT-4-Turbo.
- The translated questions were then answered with GPT-4-Turbo.
- Note that this dataset does not include the multi-turn dialogue portion of the original LIMA dataset.
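A minimal loading sketch, assuming the standard `datasets` API; the `User`/`Assistant` field names and the single `train` split come from this card's metadata:
```python
from datasets import load_dataset

# 1,000 single-turn examples, per the dataset_info metadata
ds = load_dataset("Jellyfish042/Chinese-LIMA-V0", split="train")

sample = ds[0]
print(sample["User"])       # question, translated into Chinese by GPT-4-Turbo
print(sample["Assistant"])  # GPT-4-Turbo's answer to the translated question
```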
| Jellyfish042/Chinese-LIMA-V0 | [
"language:zh",
"license:mit",
"region:us"
] | 2024-01-17T13:08:52+00:00 | {"language": ["zh"], "license": "mit", "dataset_info": {"features": [{"name": "User", "dtype": "string"}, {"name": "Assistant", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1425302, "num_examples": 1000}], "download_size": 934484, "dataset_size": 1425302}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-17T13:14:46+00:00 |
96f853a330994d4e89ea4ab6643fa1d3ba700f14 |
# Dataset of yunaka/ユナカ (Fire Emblem)
This is the dataset of yunaka/ユナカ (Fire Emblem), containing 285 images and their tags.
The core tags of this character are `long_hair, red_hair, breasts, red_eyes, large_breasts, bangs, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 285 | 454.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yunaka_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 285 | 225.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yunaka_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 714 | 504.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yunaka_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 285 | 386.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yunaka_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 714 | 783.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yunaka_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/yunaka_fireemblem',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
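If a cluster corresponds to a coherent outfit, the matching images can be re-saved as a subset; a minimal sketch, with tags taken from cluster #2 (the Christmas outfit) in the tables below, and assuming `item.image` is a PIL image, as the loading snippet's `print` suggests:
```python
import os

from waifuc.source import LocalSource

# Tags taken from cluster #2 (the Christmas outfit) in the tables below
wanted = {'christmas', 'santa_hat', 'santa_costume'}
out_dir = 'yunaka_christmas'  # hypothetical output directory
os.makedirs(out_dir, exist_ok=True)

for item in LocalSource('dataset_dir'):
    if wanted.issubset(set(item.meta['tags'])):
        # assumption: item.image is a PIL image with a .save() method
        item.image.save(os.path.join(out_dir, item.meta['filename']))
```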
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, star_(symbol), white_shirt, blush, collared_shirt, black_skirt, simple_background, tattoo, white_background, medium_breasts, open_mouth |
| 1 | 11 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, simple_background, solo, star_(symbol), white_background, open_mouth, cape, choker, facial_mark, one_eye_closed, upper_body, blush, :d, ;d |
| 2 | 7 |  |  |  |  |  | 1girl, christmas, gloves, looking_at_viewer, santa_hat, smile, solo, star_(symbol), cleavage, santa_costume, bell, open_mouth, blush, candy_cane, holding, official_alternate_costume, one_eye_closed, cape, fur_trim, medium_breasts, sack |
| 3 | 16 |  |  |  |  |  | 1girl, looking_at_viewer, solo, cape, holding_weapon, cleavage, bodysuit, smile, dagger, holding_knife, white_background, simple_background, one_eye_closed, open_mouth, star_hair_ornament |
| 4 | 7 |  |  |  |  |  | 1boy, 1girl, hetero, solo_focus, star_(symbol), nipples, open_mouth, penis, sex, tattoo, blush, nude, vaginal, facial_mark, mosaic_censoring, pussy, smile, torn_clothes, choker, collarbone, pubic_hair, spread_legs, sweat |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | smile | solo | star_(symbol) | white_shirt | blush | collared_shirt | black_skirt | simple_background | tattoo | white_background | medium_breasts | open_mouth | cleavage | cape | choker | facial_mark | one_eye_closed | upper_body | :d | ;d | christmas | gloves | santa_hat | santa_costume | bell | candy_cane | holding | official_alternate_costume | fur_trim | sack | holding_weapon | bodysuit | dagger | holding_knife | star_hair_ornament | 1boy | hetero | solo_focus | nipples | penis | sex | nude | vaginal | mosaic_censoring | pussy | torn_clothes | collarbone | pubic_hair | spread_legs | sweat |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-------|:----------------|:--------------|:--------|:-----------------|:--------------|:--------------------|:---------|:-------------------|:-----------------|:-------------|:-----------|:-------|:---------|:--------------|:-----------------|:-------------|:-----|:-----|:------------|:---------|:------------|:----------------|:-------|:-------------|:----------|:-----------------------------|:-----------|:-------|:-----------------|:-----------|:---------|:----------------|:---------------------|:-------|:---------|:-------------|:----------|:--------|:------|:-------|:----------|:-------------------|:--------|:---------------|:-------------|:-------------|:--------------|:--------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | | X | X | | X | | | X | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | X | | X | | | | | | X | X | X | X | | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 3 | 16 |  |  |  |  |  | X | X | X | X | | | | | | X | | X | | X | X | X | | | X | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | X | | X | | X | | | | X | | | X | | | X | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/yunaka_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T13:09:05+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T14:05:34+00:00 |
6c9d64a01ed054a8bdcfb7309e4e5387f3f22464 |
# Dataset of hinoka/ヒノカ (Fire Emblem)
This is the dataset of hinoka/ヒノカ (Fire Emblem), containing 316 images and their tags.
The core tags of this character are `red_hair, short_hair, red_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 316 | 319.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hinoka_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 316 | 201.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hinoka_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 683 | 380.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hinoka_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 316 | 287.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hinoka_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 683 | 496.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hinoka_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/hinoka_fireemblem',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
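The IMG+TXT packages can also be inspected without waifuc; a minimal sketch, assuming `dataset-800.zip` has been extracted to `dataset_800/` and that each image ships with a same-named `.txt` file of comma-separated tags (the usual IMG+TXT convention):
```python
from pathlib import Path

# Scan every tag file in the extracted IMG+TXT package
for txt in sorted(Path('dataset_800').rglob('*.txt')):
    tags = {t.strip() for t in txt.read_text(encoding='utf-8').split(',')}
    if 'naked_towel' in tags:  # tag from cluster #5 in the tables below
        print(txt.stem, f'({len(tags)} tags)')
```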
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 |  |  |  |  |  | 1boy, 1girl, hetero, solo_focus, blush, penis, sex, open_mouth, nipples, pussy, vaginal, cum, navel, completely_nude, uncensored, ass, medium_breasts, thighhighs |
| 1 | 13 |  |  |  |  |  | 1girl, huge_breasts, muscular_female, nipples, hetero, solo_focus, thick_thighs, 1boy, ass, sex, completely_nude, antenna_hair, futanari, smile, testicles, white_background, cum, gigantic_breasts, huge_penis |
| 2 | 6 |  |  |  |  |  | 1girl, nipples, nude, solo, blush, female_pubic_hair, pussy, small_breasts, navel, simple_background, smile |
| 3 | 6 |  |  |  |  |  | 1girl, medium_breasts, nipples, nude, solo, navel, smile |
| 4 | 5 |  |  |  |  |  | 1girl, garter_straps, holding_weapon, short_dress, solo, thigh_boots, elbow_gloves, shoulder_armor, smile, zettai_ryouiki, looking_at_viewer, naginata, simple_background, full_body, grey_background, open_mouth, red_footwear, red_thighhighs, spear, white_scarf |
| 5 | 12 |  |  |  |  |  | 1girl, solo, naked_towel, blush, looking_at_viewer, ahoge, bare_shoulders, towel_on_head, bucket, cleavage, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1boy | 1girl | hetero | solo_focus | blush | penis | sex | open_mouth | nipples | pussy | vaginal | cum | navel | completely_nude | uncensored | ass | medium_breasts | thighhighs | huge_breasts | muscular_female | thick_thighs | antenna_hair | futanari | smile | testicles | white_background | gigantic_breasts | huge_penis | nude | solo | female_pubic_hair | small_breasts | simple_background | garter_straps | holding_weapon | short_dress | thigh_boots | elbow_gloves | shoulder_armor | zettai_ryouiki | looking_at_viewer | naginata | full_body | grey_background | red_footwear | red_thighhighs | spear | white_scarf | naked_towel | ahoge | bare_shoulders | towel_on_head | bucket | cleavage |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------|:--------|:---------|:-------------|:--------|:--------|:------|:-------------|:----------|:--------|:----------|:------|:--------|:------------------|:-------------|:------|:-----------------|:-------------|:---------------|:------------------|:---------------|:---------------|:-----------|:--------|:------------|:-------------------|:-------------------|:-------------|:-------|:-------|:--------------------|:----------------|:--------------------|:----------------|:-----------------|:--------------|:--------------|:---------------|:-----------------|:-----------------|:--------------------|:-----------|:------------|:------------------|:---------------|:-----------------|:--------|:--------------|:--------------|:--------|:-----------------|:----------------|:---------|:-----------|
| 0 | 22 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | X | X | X | | | X | | X | | | X | | X | | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | | X | | | X | | | | X | X | | | X | | | | | | | | | | | X | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | | X | | | | | | | X | | | | X | | | | X | | | | | | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | | X | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 5 | 12 |  |  |  |  |  | | X | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | X | | | | | | | | X | | | | | | | | X | X | X | X | X | X |
| CyberHarem/hinoka_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T13:12:55+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T14:32:48+00:00 |
a6a6b1f6582ff1c5a55240547fcbcd556ac1ed95 |
# Dataset Card for Evaluation run of Cartinoe5930/iDUS
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Cartinoe5930/iDUS](https://huggingface.co/Cartinoe5930/iDUS) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Cartinoe5930__iDUS",
"harness_winogrande_5",
split="train")
```
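To explore beyond a single task, you can also enumerate the available configurations; a short sketch using the standard `datasets` API (the `harness_gsm8k_5` config name and the `latest` split name come from this repo's metadata):
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_Cartinoe5930__iDUS"

# One configuration per evaluated task (63 in total for this run)
configs = get_dataset_config_names(repo)
print(len(configs), "configurations, e.g.", configs[:3])

# Each config also exposes a "latest" split pointing to the most recent run
details = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(details.column_names)
```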
## Latest results
These are the [latest results from run 2024-01-17T13:14:26.897278](https://huggingface.co/datasets/open-llm-leaderboard/details_Cartinoe5930__iDUS/blob/main/results_2024-01-17T13-14-26.897278.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24846552462908655,
"acc_stderr": 0.030596938174151714,
"acc_norm": 0.24984616637570042,
"acc_norm_stderr": 0.031417483125595794,
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396754,
"mc2": 0.48577541497626797,
"mc2_stderr": 0.016589496055636796
},
"harness|arc:challenge|25": {
"acc": 0.20733788395904437,
"acc_stderr": 0.011846905782971352,
"acc_norm": 0.2773037542662116,
"acc_norm_stderr": 0.013082095839059374
},
"harness|hellaswag|10": {
"acc": 0.26020713005377416,
"acc_stderr": 0.004378508362084367,
"acc_norm": 0.2664807807209719,
"acc_norm_stderr": 0.004412149415717922
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2,
"acc_stderr": 0.034554737023254366,
"acc_norm": 0.2,
"acc_norm_stderr": 0.034554737023254366
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.32894736842105265,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.32894736842105265,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.025288394502891366,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.025288394502891366
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383889,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383889
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.24680851063829787,
"acc_stderr": 0.02818544130123409,
"acc_norm": 0.24680851063829787,
"acc_norm_stderr": 0.02818544130123409
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.038351539543994194,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.038351539543994194
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.1793103448275862,
"acc_stderr": 0.031967664333731854,
"acc_norm": 0.1793103448275862,
"acc_norm_stderr": 0.031967664333731854
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776564,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776564
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.18064516129032257,
"acc_stderr": 0.021886178567172548,
"acc_norm": 0.18064516129032257,
"acc_norm_stderr": 0.021886178567172548
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603489,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603489
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21851851851851853,
"acc_stderr": 0.02519575225182379,
"acc_norm": 0.21851851851851853,
"acc_norm_stderr": 0.02519575225182379
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.02665353159671549,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.02665353159671549
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.20917431192660552,
"acc_stderr": 0.01743793717334323,
"acc_norm": 0.20917431192660552,
"acc_norm_stderr": 0.01743793717334323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.20253164556962025,
"acc_stderr": 0.026160568246601457,
"acc_norm": 0.20253164556962025,
"acc_norm_stderr": 0.026160568246601457
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578729,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578729
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28607918263090676,
"acc_stderr": 0.016160871405127522,
"acc_norm": 0.28607918263090676,
"acc_norm_stderr": 0.016160871405127522
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.02392915551735128,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.02392915551735128
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24902216427640156,
"acc_stderr": 0.01104489226404077,
"acc_norm": 0.24902216427640156,
"acc_norm_stderr": 0.01104489226404077
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19117647058823528,
"acc_stderr": 0.02388688192244034,
"acc_norm": 0.19117647058823528,
"acc_norm_stderr": 0.02388688192244034
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.01740181671142765,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.01740181671142765
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22276621787025705,
"mc1_stderr": 0.014566506961396754,
"mc2": 0.48577541497626797,
"mc2_stderr": 0.016589496055636796
},
"harness|winogrande|5": {
"acc": 0.49171270718232046,
"acc_stderr": 0.014050555322824194
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Cartinoe5930__iDUS | [
"region:us"
] | 2024-01-17T13:16:43+00:00 | {"pretty_name": "Evaluation run of Cartinoe5930/iDUS", "dataset_summary": "Dataset automatically created during the evaluation run of model [Cartinoe5930/iDUS](https://huggingface.co/Cartinoe5930/iDUS) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Cartinoe5930__iDUS\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-17T13:14:26.897278](https://huggingface.co/datasets/open-llm-leaderboard/details_Cartinoe5930__iDUS/blob/main/results_2024-01-17T13-14-26.897278.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24846552462908655,\n \"acc_stderr\": 0.030596938174151714,\n \"acc_norm\": 0.24984616637570042,\n \"acc_norm_stderr\": 0.031417483125595794,\n \"mc1\": 0.22276621787025705,\n \"mc1_stderr\": 0.014566506961396754,\n \"mc2\": 0.48577541497626797,\n \"mc2_stderr\": 0.016589496055636796\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.20733788395904437,\n \"acc_stderr\": 0.011846905782971352,\n \"acc_norm\": 0.2773037542662116,\n \"acc_norm_stderr\": 0.013082095839059374\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26020713005377416,\n \"acc_stderr\": 0.004378508362084367,\n \"acc_norm\": 0.2664807807209719,\n \"acc_norm_stderr\": 0.004412149415717922\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.034554737023254366,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.034554737023254366\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.32894736842105265,\n \"acc_stderr\": 0.03823428969926604,\n \"acc_norm\": 0.32894736842105265,\n \"acc_norm_stderr\": 0.03823428969926604\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.025288394502891366,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.025288394502891366\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 
0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383889,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383889\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.24680851063829787,\n \"acc_stderr\": 0.02818544130123409,\n \"acc_norm\": 0.24680851063829787,\n \"acc_norm_stderr\": 0.02818544130123409\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.038351539543994194,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.038351539543994194\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.1793103448275862,\n \"acc_stderr\": 0.031967664333731854,\n \"acc_norm\": 0.1793103448275862,\n \"acc_norm_stderr\": 0.031967664333731854\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776564,\n \"acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776564\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.18064516129032257,\n \"acc_stderr\": 0.021886178567172548,\n \"acc_norm\": 0.18064516129032257,\n \"acc_norm_stderr\": 0.021886178567172548\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603489,\n \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603489\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n 
\"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.21851851851851853,\n \"acc_stderr\": 0.02519575225182379,\n \"acc_norm\": 0.21851851851851853,\n \"acc_norm_stderr\": 0.02519575225182379\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.02665353159671549,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02665353159671549\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.20917431192660552,\n \"acc_stderr\": 0.01743793717334323,\n \"acc_norm\": 0.20917431192660552,\n \"acc_norm_stderr\": 0.01743793717334323\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.20253164556962025,\n \"acc_stderr\": 0.026160568246601457,\n \"acc_norm\": 0.20253164556962025,\n \"acc_norm_stderr\": 0.026160568246601457\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28607918263090676,\n \"acc_stderr\": 0.016160871405127522,\n 
\"acc_norm\": 0.28607918263090676,\n \"acc_norm_stderr\": 0.016160871405127522\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.02392915551735128,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.02392915551735128\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24902216427640156,\n \"acc_stderr\": 0.01104489226404077,\n \"acc_norm\": 0.24902216427640156,\n \"acc_norm_stderr\": 0.01104489226404077\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.19117647058823528,\n \"acc_stderr\": 0.02388688192244034,\n \"acc_norm\": 0.19117647058823528,\n \"acc_norm_stderr\": 0.02388688192244034\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.01740181671142765,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.01740181671142765\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22276621787025705,\n \"mc1_stderr\": 0.014566506961396754,\n \"mc2\": 0.48577541497626797,\n \"mc2_stderr\": 0.016589496055636796\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.49171270718232046,\n \"acc_stderr\": 0.014050555322824194\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/Cartinoe5930/iDUS", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|arc:challenge|25_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|gsm8k|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hellaswag|10_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T13-14-26.897278.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T13-14-26.897278.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-17T13-14-26.897278.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T13-14-26.897278.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T13-14-26.897278.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T13-14-26.897278.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["**/details_harness|winogrande|5_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-17T13-14-26.897278.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_17T13_14_26.897278", "path": ["results_2024-01-17T13-14-26.897278.parquet"]}, {"split": "latest", "path": 
["results_2024-01-17T13-14-26.897278.parquet"]}]}]} | 2024-01-17T13:17:05+00:00 |
5927da75e3fa805c7748cfbcdd239d167a1c4105 |
This is a cleaned version of the Quora dataset, configured with a train/test/val split (a short loading sketch follows the list below).
- Train: For training models
- Test: For running experiments and comparing different OSS and closed-source models
- Val: Only to be used at the **end**!
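A minimal loading sketch (assuming the standard Hugging Face `datasets` API and the split names above):

```python
from datasets import load_dataset

# Load the cleaned Quora dataset; this returns a DatasetDict with all three splits.
ds = load_dataset("567-labs/cleaned-quora-dataset-train-test-split")

train = ds["train"]  # model training
test = ds["test"]    # experiments and model comparisons
val = ds["val"]      # hold out until the very end

# Each row holds a question pair and a duplicate label.
row = train[0]
print(row["questions"]["text"], row["is_duplicate"])
```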
Colab notebook to reproduce: https://colab.research.google.com/drive/1dGjGiqwPV1M7JOLfcPEsSh3SC37urItS?usp=sharing | 567-labs/cleaned-quora-dataset-train-test-split | [
"region:us"
] | 2024-01-17T13:18:52+00:00 | {"dataset_info": {"features": [{"name": "questions", "struct": [{"name": "id", "sequence": "int64"}, {"name": "text", "sequence": "string"}]}, {"name": "is_duplicate", "dtype": "bool"}], "splits": [{"name": "train", "num_bytes": 39231843, "num_examples": 261317}, {"name": "test", "num_bytes": 7005599, "num_examples": 44635}, {"name": "val", "num_bytes": 6704734, "num_examples": 42232}], "download_size": 31031925, "dataset_size": 52942176}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "val", "path": "data/val-*"}]}]} | 2024-01-29T14:44:01+00:00 |
59a030207046255f521a01dd9219b2f952e5844c |
# Dataset Card for Evaluation run of zhengr/MixTAO-7Bx2-MoE-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [zhengr/MixTAO-7Bx2-MoE-DPO](https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset

# Each evaluated task is its own configuration; "train" points to the latest run.
data = load_dataset("open-llm-leaderboard/details_zhengr__MixTAO-7Bx2-MoE-DPO",
	"harness_winogrande_5",
	split="train")
```
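The aggregated metrics live in the "results" configuration mentioned above. As a sketch (assuming this repo follows the usual leaderboard-details layout, where each configuration also exposes a "latest" split):

```python
from datasets import load_dataset

# Aggregated run-level metrics; "latest" points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_zhengr__MixTAO-7Bx2-MoE-DPO",
	"results",
	split="latest")
```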
## Latest results
These are the [latest results from run 2024-01-17T13:23:29.676681](https://huggingface.co/datasets/open-llm-leaderboard/details_zhengr__MixTAO-7Bx2-MoE-DPO/blob/main/results_2024-01-17T13-23-29.676681.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each eval's results in its "latest" split, and the aggregates in the "results" configuration):
```python
{
"all": {
"acc": 0.6522594762182442,
"acc_stderr": 0.0319843897745183,
"acc_norm": 0.6520128707870778,
"acc_norm_stderr": 0.03264277329988372,
"mc1": 0.5385556915544676,
"mc1_stderr": 0.017451384104637455,
"mc2": 0.6934208246675816,
"mc2_stderr": 0.014891018416465928
},
"harness|arc:challenge|25": {
"acc": 0.6808873720136519,
"acc_stderr": 0.01362169611917331,
"acc_norm": 0.7090443686006825,
"acc_norm_stderr": 0.013273077865907595
},
"harness|hellaswag|10": {
"acc": 0.7027484564827724,
"acc_stderr": 0.004561141293448457,
"acc_norm": 0.8712407886875124,
"acc_norm_stderr": 0.003342487333262275
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944427,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944427
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289715,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289715
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.015094215699700486,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.015094215699700486
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.024152225962801584,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.024152225962801584
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.02531049537694486,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.02531049537694486
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.0384985609879409,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.0384985609879409
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128138,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128138
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903335,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903335
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4212290502793296,
"acc_stderr": 0.01651367603117959,
"acc_norm": 0.4212290502793296,
"acc_norm_stderr": 0.01651367603117959
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279056,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279056
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.023468429832451156,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.023468429832451156
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4771838331160365,
"acc_stderr": 0.012756933382823698,
"acc_norm": 0.4771838331160365,
"acc_norm_stderr": 0.012756933382823698
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.02954774168764004,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.02954774168764004
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5385556915544676,
"mc1_stderr": 0.017451384104637455,
"mc2": 0.6934208246675816,
"mc2_stderr": 0.014891018416465928
},
"harness|winogrande|5": {
"acc": 0.8121546961325967,
"acc_stderr": 0.010977481103435091
},
"harness|gsm8k|5": {
"acc": 0.7035633055344959,
"acc_stderr": 0.012579398235589534
}
}
```
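To work with this results file outside of `datasets`, the raw JSON can be fetched directly; a minimal sketch using `huggingface_hub` (the filename comes from the link above):

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON referenced above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_zhengr__MixTAO-7Bx2-MoE-DPO",
    repo_type="dataset",
    filename="results_2024-01-17T13-23-29.676681.json",
)
with open(path) as f:
    run = json.load(f)

print(sorted(run.keys()))  # inspect the top-level structure
```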
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_zhengr__MixTAO-7Bx2-MoE-DPO | [
"region:us"
] | 2024-01-17T13:25:46+00:00 | {"pretty_name": "Evaluation run of zhengr/MixTAO-7Bx2-MoE-DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [zhengr/MixTAO-7Bx2-MoE-DPO](https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zhengr__MixTAO-7Bx2-MoE-DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-17T13:23:29.676681](https://huggingface.co/datasets/open-llm-leaderboard/details_zhengr__MixTAO-7Bx2-MoE-DPO/blob/main/results_2024-01-17T13-23-29.676681.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6522594762182442,\n \"acc_stderr\": 0.0319843897745183,\n \"acc_norm\": 0.6520128707870778,\n \"acc_norm_stderr\": 0.03264277329988372,\n \"mc1\": 0.5385556915544676,\n \"mc1_stderr\": 0.017451384104637455,\n \"mc2\": 0.6934208246675816,\n \"mc2_stderr\": 0.014891018416465928\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6808873720136519,\n \"acc_stderr\": 0.01362169611917331,\n \"acc_norm\": 0.7090443686006825,\n \"acc_norm_stderr\": 0.013273077865907595\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7027484564827724,\n \"acc_stderr\": 0.004561141293448457,\n \"acc_norm\": 0.8712407886875124,\n \"acc_norm_stderr\": 0.003342487333262275\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944427,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944427\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289715,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289715\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 
0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8550458715596331,\n \"acc_stderr\": 0.015094215699700486,\n \"acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.015094215699700486\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.033812000056435254,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.033812000056435254\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801584,\n \"acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801584\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.02531049537694486,\n \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.02531049537694486\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.0384985609879409,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.0384985609879409\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.02158649400128138,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.02158649400128138\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n \"acc_stderr\": 0.013586619219903335,\n \"acc_norm\": 0.8250319284802043,\n \"acc_norm_stderr\": 
0.013586619219903335\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n \"acc_stderr\": 0.01651367603117959,\n \"acc_norm\": 0.4212290502793296,\n \"acc_norm_stderr\": 0.01651367603117959\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279056,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279056\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.023468429832451156,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.023468429832451156\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4771838331160365,\n \"acc_stderr\": 0.012756933382823698,\n \"acc_norm\": 0.4771838331160365,\n \"acc_norm_stderr\": 0.012756933382823698\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.02954774168764004,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.02954774168764004\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5385556915544676,\n \"mc1_stderr\": 0.017451384104637455,\n \"mc2\": 0.6934208246675816,\n \"mc2_stderr\": 0.014891018416465928\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8121546961325967,\n \"acc_stderr\": 0.010977481103435091\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7035633055344959,\n \"acc_stderr\": 0.012579398235589534\n }\n}\n```", "repo_url": "https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-DPO", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|arc:challenge|25_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|gsm8k|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hellaswag|10_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T13-23-29.676681.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T13-23-29.676681.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-17T13-23-29.676681.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T13-23-29.676681.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T13-23-29.676681.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T13-23-29.676681.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["**/details_harness|winogrande|5_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-17T13-23-29.676681.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_17T13_23_29.676681", "path": ["results_2024-01-17T13-23-29.676681.parquet"]}, {"split": "latest", "path": 
["results_2024-01-17T13-23-29.676681.parquet"]}]}]} | 2024-01-17T13:26:07+00:00 |
5d92806e2e296fc269c18397cb5b45e86b4b58bb |
# Dataset of pieri/ピエリ (Fire Emblem)
This is the dataset of pieri/ピエリ (Fire Emblem), containing 194 images and their tags.
The core tags of this character are `blue_hair, multicolored_hair, hair_over_one_eye, pink_hair, twintails, breasts, two-tone_hair, red_eyes, gradient_hair, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 194 | 216.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pieri_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 194 | 127.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pieri_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 427 | 258.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pieri_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 194 | 190.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pieri_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 427 | 357.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pieri_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/pieri_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
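If you prefer one of the plain IMG+TXT packages over the raw archive, the download step is the same — only the `filename` changes. Below is a minimal sketch for the `800` package; it assumes the common IMG+TXT convention that each image ships with a same-stem `.txt` file holding its tags (an assumed layout — check the extracted files):
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT archive (filename taken from the packages table above)
zip_file = hf_hub_download(
    repo_id='CyberHarem/pieri_fireemblem',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to your directory
dataset_dir = 'dataset_800_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its tag file (assumed convention: same stem, .txt suffix)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
        continue
    txt_path = os.path.join(dataset_dir, stem + '.txt')
    if os.path.exists(txt_path):
        with open(txt_path, 'r', encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```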
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, simple_background, solo, armor, white_background, smile, looking_at_viewer, sword, upper_body |
| 1 | 10 |  |  |  |  |  | 1girl, armor, solo, spear, open_mouth |
| 2 | 14 |  |  |  |  |  | 1girl, solo, nipples, nude, pussy, smile, blush, looking_at_viewer, uncensored, navel |
| 3 | 7 |  |  |  |  |  | blush, nipples, nude, solo_focus, 1boy, 1girl, cum_on_breasts, hetero, smile, cum_on_hair, facial, paizuri, penis, censored, closed_eyes, collarbone, long_hair, open_mouth |
| 4 | 7 |  |  |  |  |  | 1boy, 1girl, hetero, nipples, solo_focus, open_mouth, blush, pink_eyes, ahegao, completely_nude, medium_breasts, sex_from_behind, simple_background, tongue_out, arm_grab, arm_held_back, navel, standing_sex, vaginal |
| 5 | 9 |  |  |  |  |  | 1boy, fellatio, hetero, penis, 1girl, solo_focus, uncensored, nude, blush, english_text, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | simple_background | solo | armor | white_background | smile | looking_at_viewer | sword | upper_body | spear | open_mouth | nipples | nude | pussy | blush | uncensored | navel | solo_focus | 1boy | cum_on_breasts | hetero | cum_on_hair | facial | paizuri | penis | censored | closed_eyes | collarbone | long_hair | pink_eyes | ahegao | completely_nude | medium_breasts | sex_from_behind | tongue_out | arm_grab | arm_held_back | standing_sex | vaginal | fellatio | english_text |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:-------------------|:--------|:--------------------|:--------|:-------------|:--------|:-------------|:----------|:-------|:--------|:--------|:-------------|:--------|:-------------|:-------|:-----------------|:---------|:--------------|:---------|:----------|:--------|:-----------|:--------------|:-------------|:------------|:------------|:---------|:------------------|:-----------------|:------------------|:-------------|:-----------|:----------------|:---------------|:----------|:-----------|:---------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | | X | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | | X | | | X | X | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | | | | X | | | | | X | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | X | | | | | | | | | X | X | | | X | | X | X | X | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | |
| 5 | 9 |  |  |  |  |  | X | X | | | | | | | | | | | X | | X | X | | X | X | | X | | | | X | | | | | | | | | | | | | | | X | X |
| CyberHarem/pieri_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T13:30:49+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T14:06:52+00:00 |
e781bb38cf9b3106cd9523b300e2235c30123f8b |
# Dataset of oboro/オボロ (Fire Emblem)
This is the dataset of oboro/オボロ (Fire Emblem), containing 108 images and their tags.
The core tags of this character are `long_hair, ponytail, blue_hair, brown_eyes, breasts, braid`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 108 | 91.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oboro_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 108 | 66.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oboro_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 235 | 125.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oboro_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 108 | 86.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oboro_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 235 | 155.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oboro_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/oboro_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
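Once loaded, the source from the snippet above can also be filtered in plain Python — for example, collecting only the images tagged `solo` before further processing. This is a minimal sketch reusing `item.meta['tags']`; it only assumes that `in` membership testing works on that structure (true whether the tags are stored as a list or as a dict keyed by tag name):
```python
from waifuc.source import LocalSource

# walk the extracted dataset and keep only items carrying the 'solo' tag
source = LocalSource('dataset_dir')
solo_files = []
for item in source:
    if 'solo' in item.meta['tags']:  # works for list- or dict-shaped tag data
        solo_files.append(item.meta['filename'])

print(f'{len(solo_files)} solo images found')
```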
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, solo, japanese_clothes, armor, naginata, simple_background, smile, white_background, looking_at_viewer, spear |
| 1 | 12 |  |  |  |  |  | 1girl, hetero, penis, solo_focus, blush, nipples, sex, vaginal, 1boy, nude, large_breasts, cum_in_pussy, navel, sweat, bar_censor, medium_breasts, open_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | japanese_clothes | armor | naginata | simple_background | smile | white_background | looking_at_viewer | spear | hetero | penis | solo_focus | blush | nipples | sex | vaginal | 1boy | nude | large_breasts | cum_in_pussy | navel | sweat | bar_censor | medium_breasts | open_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------------------|:--------|:-----------|:--------------------|:--------|:-------------------|:--------------------|:--------|:---------|:--------|:-------------|:--------|:----------|:------|:----------|:-------|:-------|:----------------|:---------------|:--------|:--------|:-------------|:-----------------|:-------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/oboro_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T13:31:00+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T13:52:06+00:00 |
0ec15d902aae33d887074375c92772e07d5dc2a0 |
# Dataset of est/エスト (Fire Emblem)
This is the dataset of est/エスト (Fire Emblem), containing 103 images and their tags.
The core tags of this character are `short_hair, pink_hair, breasts, red_hair, pink_eyes, headband, red_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 103 | 94.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/est_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 103 | 62.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/est_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 208 | 113.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/est_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 103 | 86.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/est_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 208 | 149.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/est_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/est_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
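The 3-stage cropped packages are fetched the same way — only the `filename` argument changes, matching the archive names in the packages table above. A short sketch for the `stage3-p480-800` variant:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# cropped variant; archive name taken from the packages table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/est_fireemblem',
    repo_type='dataset',
    filename='dataset-stage3-p480-800.zip',
)

# extract files to your directory
dataset_dir = 'dataset_stage3_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
print(f'{len(os.listdir(dataset_dir))} files extracted')
```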
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | hetero, 1girl, blush, cum_in_pussy, thighhighs, armor, solo_focus, vaginal, elbow_gloves, heart-shaped_pupils, open_mouth, spread_legs, breasts_out, clothed_sex, nipples, overflow, 1boy, 2boys, censored, fingerless_gloves, group_sex, medium_breasts, multiple_penises, torn_clothes |
| 1 | 6 |  |  |  |  |  | 1girl, nipples, solo, thighhighs, blush, cross-laced_footwear, looking_at_viewer, nude, open_mouth, smile, thigh_boots, pussy, medium_breasts, spread_legs |
| 2 | 6 |  |  |  |  |  | 1girl, elbow_gloves, solo, spear, thighhighs, zettai_ryouiki, dress, smile, thigh_boots, open_mouth, sword, breastplate, fingerless_gloves, full_body, holding, looking_at_viewer, pegasus_knight_uniform_(fire_emblem), shoulder_armor |
| 3 | 6 |  |  |  |  |  | 1girl, bangs, belt, elbow_gloves, full_body, open_mouth, sheath, short_dress, shoulder_armor, solo, sword, thigh_boots, thighhighs, holding_weapon, shiny_hair, simple_background, spear, white_background, white_gloves, gold_trim, leg_up, medium_breasts, pelvic_curtain, thighs, white_dress, white_footwear, zettai_ryouiki, high_heels, sleeveless |
| 4 | 11 |  |  |  |  |  | rabbit_ears, 1girl, fake_animal_ears, smile, solo, open_mouth, see-through, thighhighs, hair_ornament, choker, easter_egg, white_gloves, dress, flower, full_body, simple_background, looking_at_viewer, official_alternate_costume, shorts, white_background, bangs, holding, puffy_short_sleeves, shiny_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | hetero | 1girl | blush | cum_in_pussy | thighhighs | armor | solo_focus | vaginal | elbow_gloves | heart-shaped_pupils | open_mouth | spread_legs | breasts_out | clothed_sex | nipples | overflow | 1boy | 2boys | censored | fingerless_gloves | group_sex | medium_breasts | multiple_penises | torn_clothes | solo | cross-laced_footwear | looking_at_viewer | nude | smile | thigh_boots | pussy | spear | zettai_ryouiki | dress | sword | breastplate | full_body | holding | pegasus_knight_uniform_(fire_emblem) | shoulder_armor | bangs | belt | sheath | short_dress | holding_weapon | shiny_hair | simple_background | white_background | white_gloves | gold_trim | leg_up | pelvic_curtain | thighs | white_dress | white_footwear | high_heels | sleeveless | rabbit_ears | fake_animal_ears | see-through | hair_ornament | choker | easter_egg | flower | official_alternate_costume | shorts | puffy_short_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------|:--------|:--------|:---------------|:-------------|:--------|:-------------|:----------|:---------------|:----------------------|:-------------|:--------------|:--------------|:--------------|:----------|:-----------|:-------|:--------|:-----------|:--------------------|:------------|:-----------------|:-------------------|:---------------|:-------|:-----------------------|:--------------------|:-------|:--------|:--------------|:--------|:--------|:-----------------|:--------|:--------|:--------------|:------------|:----------|:---------------------------------------|:-----------------|:--------|:-------|:---------|:--------------|:-----------------|:-------------|:--------------------|:-------------------|:---------------|:------------|:---------|:-----------------|:---------|:--------------|:-----------------|:-------------|:-------------|:--------------|:-------------------|:--------------|:----------------|:---------|:-------------|:---------|:-----------------------------|:---------|:----------------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | | X | X | | X | | | | | | X | X | | | X | | | | | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | | X | | | X | | | | X | | X | | | | | | | | | X | | | | | X | | X | | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | | X | | | X | | | | X | | X | | | | | | | | | | | X | | | X | | | | | X | | X | X | | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 4 | 11 |  |  |  |  |  | | X | | | X | | | | | | X | | | | | | | | | | | | | | X | | X | | X | | | | | X | | | X | X | | | X | | | | | X | X | X | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/est_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T13:31:06+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T13:54:20+00:00 |
5b55289694668673fc16cad900ae7a2ab2c91f47 |
# Dataset of sharon/シャロン (Fire Emblem)
This is the dataset of sharon/シャロン (Fire Emblem), containing 196 images and their tags.
The core tags of this character are `blonde_hair, green_eyes, long_hair, braid, breasts, crown_braid`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 196 | 202.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sharon_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 196 | 132.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sharon_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 415 | 251.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sharon_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 196 | 185.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sharon_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 415 | 326.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sharon_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
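The IMG+TXT packages are plain image/caption pairs. Below is a minimal sketch for reading them after extraction, assuming a flat layout in which each image sits next to a same-named `.txt` file holding its tags (this layout is an assumption, not something the table above guarantees):

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from PIL import Image

# download and extract the 800px IMG+TXT package (names follow the table above)
zip_file = hf_hub_download(
    repo_id='CyberHarem/sharon_fireemblem',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with the same-named .txt file holding its tags (assumed layout)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
        continue
    image = Image.open(os.path.join(dataset_dir, name))
    with open(os.path.join(dataset_dir, stem + '.txt'), encoding='utf-8') as f:
        tags = f.read().strip()
    print(name, image.size, tags)
```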
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sharon_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1boy, 1girl, hetero, nipples, sex, blush, medium_breasts, penis, solo_focus, open_mouth, completely_nude, cum_in_pussy, vaginal, navel, spread_legs, mosaic_censoring, thighhighs |
| 1 | 18 |  |  |  |  |  | fake_animal_ears, rabbit_ears, smile, 1girl, solo, simple_background, open_mouth, white_gloves, multicolored_hair, looking_at_viewer, rabbit_tail, pantyhose, blush, playboy_bunny, easter_egg, medium_breasts, pink_hair, cleavage_cutout, one_eye_closed, ponytail |
| 2 | 21 |  |  |  |  |  | 1girl, simple_background, solo, smile, armor, gloves, looking_at_viewer, white_background, multicolored_hair, open_mouth, blush, cape, upper_body |
| 3 | 16 |  |  |  |  |  | 1girl, armor, cape, solo, spear, looking_at_viewer, simple_background, smile, thighhighs, brown_gloves, holding_polearm, open_mouth, skirt, white_background, gradient_hair, pink_hair, shield, thigh_boots |
| 4 | 5 |  |  |  |  |  | 1girl, kimono, obi, open_mouth, solo, floral_print, hair_ornament, long_sleeves, smile, wide_sleeves, bangs, flower, gradient_hair, looking_at_viewer, pink_hair, full_body, fur_trim, gradient_clothes, holding, low-tied_long_hair, sandals, shiny_hair, tabi, transparent_background, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1boy | 1girl | hetero | nipples | sex | blush | medium_breasts | penis | solo_focus | open_mouth | completely_nude | cum_in_pussy | vaginal | navel | spread_legs | mosaic_censoring | thighhighs | fake_animal_ears | rabbit_ears | smile | solo | simple_background | white_gloves | multicolored_hair | looking_at_viewer | rabbit_tail | pantyhose | playboy_bunny | easter_egg | pink_hair | cleavage_cutout | one_eye_closed | ponytail | armor | gloves | white_background | cape | upper_body | spear | brown_gloves | holding_polearm | skirt | gradient_hair | shield | thigh_boots | kimono | obi | floral_print | hair_ornament | long_sleeves | wide_sleeves | bangs | flower | full_body | fur_trim | gradient_clothes | holding | low-tied_long_hair | sandals | shiny_hair | tabi | transparent_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------|:--------|:---------|:----------|:------|:--------|:-----------------|:--------|:-------------|:-------------|:------------------|:---------------|:----------|:--------|:--------------|:-------------------|:-------------|:-------------------|:--------------|:--------|:-------|:--------------------|:---------------|:--------------------|:--------------------|:--------------|:------------|:----------------|:-------------|:------------|:------------------|:-----------------|:-----------|:--------|:---------|:-------------------|:-------|:-------------|:--------|:---------------|:------------------|:--------|:----------------|:---------|:--------------|:---------|:------|:---------------|:----------------|:---------------|:---------------|:--------|:---------|:------------|:-----------|:-------------------|:----------|:---------------------|:----------|:-------------|:-------|:-------------------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 18 |  |  |  |  |  | | X | | | | X | X | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 21 |  |  |  |  |  | | X | | | | X | | | | X | | | | | | | | | | X | X | X | | X | X | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 16 |  |  |  |  |  | | X | | | | | | | | X | | | | | | | X | | | X | X | X | | | X | | | | | X | | | | X | | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | | X | | | | | | | | X | | | | | | | | | | X | X | | | | X | | | | | X | | | | | | | | X | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/sharon_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T13:31:40+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T14:19:58+00:00 |
f7182dca4f1ae3a0e31b4fb2ab18834eeb4b7d77 | # Dataset Card for "VietnameseBookCorpus-raw-parquet"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tmnam20/VietnameseBookCorpus-raw-parquet | [
"region:us"
] | 2024-01-17T13:35:23+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4928921669, "num_examples": 19287}], "download_size": 2543402734, "dataset_size": 4928921669}} | 2024-01-17T13:41:47+00:00 |
ad7ff55d87443132aadff208a19b700e676e2021 |
# Portuguese-Corpus
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://nkluge-correa.github.io/TeenyTinyLlama/
- **Repository:** https://github.com/Nkluge-correa/TeenyTinyLlama
- **Paper:** [TeenyTinyLlama: open-source tiny language models trained in Brazilian Portuguese](https://arxiv.org/abs/2401.16640)
- **Point of Contact:** [AIRES at PUCRS](mailto:[email protected])
### Dataset Summary
Portuguese-Corpus is a concatenation of several portions of Brazilian Portuguese datasets found on the [Hub](https://huggingface.co/datasets?task_categories=task_categories:text-generation&language=language:pt&sort=trending).
In tokenized form, the uncompressed dataset weighs 50 GB and contains approximately 4.1B tokens. This version does not include instructional content.
### Supported Tasks and Leaderboards
This dataset can be utilized for tasks involving language modeling.
### Languages
Portuguese.
## Dataset Structure
### Data Instances
The dataset consists of the following features:
- **text:** a string of text in Portuguese.
- **metadata:** the source where that string originated.
### Data Fields
```python
{
"text": "A inteligência artificial (de sigla: IA; do inglês: artificial intelligence, de sigla: AI) é um campo de estudo multidisciplinar que abrange varias áreas do conhecimento.",
"metadata": "source: https://huggingface.co/datasets/graelo/wikipedia"
}
```
### Data Splits
The only available split is `train`.
```python
from datasets import load_dataset
dataset = load_dataset("nicholasKluge/Pt-Corpus", split='train')
# If you don't want to download the entire dataset, set streaming to `True`
dataset = load_dataset("nicholasKluge/Pt-Corpus", split='train', streaming=True)
```
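When streaming, the first documents can be inspected without materializing the corpus. Here is a small sketch using the `text` and `metadata` fields described above:

```python
from itertools import islice

from datasets import load_dataset

dataset = load_dataset("nicholasKluge/Pt-Corpus", split='train', streaming=True)

# peek at the first three documents and the sources they came from
for sample in islice(dataset, 3):
    print(sample["metadata"], "--", sample["text"][:80])
```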
## Dataset Creation
### Curation Rationale
This dataset was developed as part of the [TeenyTinyLlama: open-source tiny language models trained in Brazilian Portuguese](https://arxiv.org/abs/2401.16640) paper. In this study, we document the development of open-foundation models tailored for use in low-resource settings, their limitations, and their benefits.
### Source Data
#### Initial Data Collection and Normalization
We applied some of the filters used in Rae et al. ([2021](https://arxiv.org/abs/2112.11446)), in addition to using a [fine-tuned BERTimbau](https://huggingface.co/nicholasKluge/ToxicityModelPT) to exclude samples classified above a pre-defined toxicity threshold.
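For illustration only, here is a minimal sketch of this kind of threshold filtering with the `transformers` text-classification pipeline; the threshold value and label name below are assumptions, not the values used in the paper (check the model card for the model's actual output format):

```python
from transformers import pipeline

# hypothetical threshold -- the paper's pre-defined value is not reproduced here
TOXICITY_THRESHOLD = 0.5
# hypothetical label name -- check the model card for the actual labels
TOXIC_LABEL = "TOXIC"

classifier = pipeline("text-classification", model="nicholasKluge/ToxicityModelPT")

def is_acceptable(text: str) -> bool:
    # truncate so long documents fit the classifier's context window
    result = classifier(text[:512])[0]
    return not (result["label"] == TOXIC_LABEL and result["score"] > TOXICITY_THRESHOLD)

samples = ["Oi, tudo bem?", "Um texto potencialmente ofensivo..."]
kept = [s for s in samples if is_acceptable(s)]
print(kept)
```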
#### Who are the source language producers?
All text samples are native Portuguese or translated into Portuguese from other languages (slight contamination from other languages should also be expected).
### Annotations
#### Annotation process
Portuguese-Corpus is a concatenation of several portions of Brazilian Portuguese datasets found on the [Hub](https://huggingface.co/datasets?task_categories=task_categories:text-generation&language=language:pt&sort=trending). We applied some of the filters used in Rae et al. ([2021](https://arxiv.org/abs/2112.11446)), in addition to using a [fine-tuned BERTimbau](https://huggingface.co/nicholasKluge/ToxicityModelPT) to exclude samples classified above a pre-defined toxicity threshold.
#### Who are the annotators?
[Nicholas Kluge Corrêa](mailto:[email protected]).
### Personal and Sensitive Information
This dataset, sourced from web scraping, may contain personal and sensitive information, alongside offensive, toxic, and disturbing language.
## Considerations for Using the Data
### Social Impact of Dataset
The presence of personal and sensitive information within the dataset raises concerns about privacy and data protection, potentially leading to breaches of individuals' confidentiality and security. Furthermore, the inclusion of offensive, toxic, and disturbing language in the dataset poses risks of perpetuating harmful behaviors and attitudes, contributing to the normalization of hate speech and online toxicity. Therefore, careful handling and ethical considerations are essential to mitigate these potential social impacts and promote responsible dataset use.
### Discussion of Biases
The inclusion of offensive, toxic, and disturbing language in the dataset poses risks of perpetuating harmful behaviors and attitudes, contributing to the normalization of hate speech and online toxicity.
### Other Known Limitations
A significant portion of the data within the dataset has been translated using translation engines, potentially resulting in corrupted samples of both language and code. While useful for quickly converting text between languages, translation engines often struggle with accurately preserving the syntax, semantics, and context of programming languages. As a result, the translated code may contain errors, syntax inconsistencies, or even introduce vulnerabilities, rendering it unreliable or unusable for its intended purpose.
## Additional Information
### Dataset Curators
[Nicholas Kluge Corrêa](mailto:[email protected]).
### Licensing Information
The following datasets (_only their training splits are part of the corpus_) and respective licenses form the Portuguese-Corpus:
- [Wikipedia](https://huggingface.co/datasets/graelo/wikipedia) (License: [CC BY-SA 3.0](https://creativecommons.org/licenses/by-sa/3.0/))
- [CulturaX](https://huggingface.co/datasets/uonlp/CulturaX) (License: [ODC-By](https://opendatacommons.org/licenses/by/1-0/), [cc0-1.0](https://huggingface.co/datasets/oscar-corpus/OSCAR-2301#licensing-information))
- [OSCAR](https://huggingface.co/datasets/eduagarcia/OSCAR-2301-pt_dedup) (License: [cc0-1.0](https://huggingface.co/datasets/oscar-corpus/OSCAR-2301#licensing-information))
- [CC-100](https://huggingface.co/datasets/eduagarcia/cc100-pt) (License: [Common Crawl terms of use](https://commoncrawl.org/terms-of-use/))
- [Roots Wikiquote](https://huggingface.co/datasets/bigscience-data/roots_pt_wikiquote) (License: [CC BY-SA 3.0](https://creativecommons.org/licenses/by-sa/3.0/))
- [Roots Ted Talks](https://huggingface.co/datasets/bigscience-data/roots_pt_ted_talks_iwslt) (License: [CC BY-NC-ND 4.0](https://creativecommons.org/licenses/by-nc-nd/4.0/deed.en))
### Citation Information
```latex
@misc{correa24ttllama,
title = {TeenyTinyLlama: open-source tiny language models trained in Brazilian Portuguese},
author = {Corr{\^e}a, Nicholas Kluge and Falk, Sophia and Fatimah, Shiza and Sen, Aniket and De Oliveira, Nythamar},
journal={arXiv preprint arXiv:2401.16640},
year={2024}
}
```
### Contributions
If you would like to contribute, contact me at [[email protected]](mailto:[email protected])!
| nicholasKluge/Pt-Corpus | [
"task_categories:text-generation",
"size_categories:1M<n<10M",
"language:pt",
"license:other",
"portuguese",
"language-modeling",
"arxiv:2401.16640",
"arxiv:2112.11446",
"region:us"
] | 2024-01-17T13:38:48+00:00 | {"language": ["pt"], "license": "other", "size_categories": ["1M<n<10M"], "task_categories": ["text-generation"], "pretty_name": "Pt-Corpus", "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "metadata", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 16220765175.988096, "num_examples": 5768246}], "download_size": 11478008666, "dataset_size": 16220765175.988096}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["portuguese", "language-modeling"]} | 2024-02-15T18:08:39+00:00 |
0a89d4906cfe9458497124d287d4b1bba081547a | thomasht86/ns3456_3451_clf_v2 | [
"region:us"
] | 2024-01-17T13:42:17+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": "int64"}, {"name": "split", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 103363480, "num_examples": 118557}, {"name": "test", "num_bytes": 25883559, "num_examples": 29700}], "download_size": 115494808, "dataset_size": 129247039}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-17T13:50:59+00:00 |
|
081222a5e3aedeb04b84abde27abf98e130fe602 | # This dataset contains 10,000 Kaggle posts and 60,000 comments on those posts, covering question-and-answer topics.
## Data Fields
### kaggle_post
1. 'pseudo', The username of the question author.
2. 'title', The title of the post.
3. 'question', The body of the question.
4. 'vote', The number of votes the post received (voting on Kaggle is similar to liking).
5. 'medal', The medal awarded to the post under Kaggle's progression system (https://www.kaggle.com/progression), which grants medals to users based on their performance.
6. 'nbr_comment', The number of comments on the post.
7. 'date', The date of the post.
8. 'url_post', The URL of the post; use it to link with the comment dataset (see the join sketch below).
9. 'url_competition', The URL of the competition, if the question is related to one.
10. 'rank_competition', The author's rank in the competition.
### kaggle_comment
1. 'pseudo_com', The username of the answer author.
2. 'answer', The body of the answer.
3. 'vote_com', The number of likes the answer received.
4. 'medal_com', The medal awarded under Kaggle's progression system (https://www.kaggle.com/progression), which grants medals to users based on their performance.
5. 'date_com', The date of the answer.
6. 'url_post', The URL of the parent post; use it to link with the post dataset (see the join sketch below).
7. 'rank_competition', The author's rank in the competition.
Data scraped by Mathieu Duverne in August 2023.
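Because both tables share `url_post`, they can be joined into question-answer pairs. Here is a minimal pandas sketch, assuming the two tables are available as CSV files (the file names below are hypothetical):

```python
import pandas as pd

# hypothetical file names -- adjust to the actual files shipped with the dataset
posts = pd.read_csv("kaggle_post.csv")
comments = pd.read_csv("kaggle_comment.csv")

# attach each comment to its parent post via the shared post URL
qa_pairs = comments.merge(posts, on="url_post", how="left", suffixes=("_com", "_post"))

# example: the most-liked answer for every post
top_answers = (
    qa_pairs.sort_values("vote_com", ascending=False)
            .groupby("url_post", as_index=False)
            .first()
)
print(top_answers[["title", "answer", "vote_com"]].head())
```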
| Raaxx/Kaggle-post-and-comments-question-answer-topic | [
"task_categories:question-answering",
"language:en",
"region:us"
] | 2024-01-17T13:42:26+00:00 | {"language": ["en"], "task_categories": ["question-answering"]} | 2024-01-17T14:27:40+00:00 |
d8d073e9e0f2cf4f25f97ebbd9a5a0238019fa9e | alexpanick/vaiNeymar | [
"license:mit",
"region:us"
] | 2024-01-17T13:44:15+00:00 | {"license": "mit"} | 2024-01-17T13:58:14+00:00 |
|
31298db19f3d7cabf930104c28afc78aa86de3d3 |
# Dataset of elaice/イレース (Fire Emblem)
This is the dataset of elaice/イレース (Fire Emblem), containing 138 images and their tags.
The core tags of this character are `purple_hair, long_hair, purple_eyes, twintails, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 138 | 111.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elaice_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 138 | 77.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elaice_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 258 | 141.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elaice_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 138 | 103.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elaice_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 258 | 175.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elaice_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/elaice_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, hetero, multiple_penises, solo_focus, mosaic_censoring, gangbang, nipples, vaginal, blush, cum_in_pussy, medium_breasts, 2boys, 3boys, circlet, facial, fellatio, handjob, testicles |
| 1 | 5 |  |  |  |  |  | 1girl, nipples, solo, medium_breasts, open_mouth, blush, completely_nude, navel, artist_name, circlet, food, hair_flower, large_breasts, looking_at_viewer, pussy, signature, simple_background, sitting |
| 2 | 8 |  |  |  |  |  | 1girl, cape, circlet, skirt, solo, low-tied_long_hair, book, simple_background, sitting, white_background |
| 3 | 7 |  |  |  |  |  | 1girl, circlet, full_body, short_sleeves, simple_background, solo, bangs, capelet, hood_down, low_twintails, white_footwear, miniskirt, shiny_hair, white_background, belt_pouch, closed_mouth, purple_skirt, holding_book, jewelry, knee_boots, looking_at_viewer, magic, open_book |
| 4 | 9 |  |  |  |  |  | 1girl, alternate_costume, solo, candy, halloween_costume, holding, long_sleeves, circlet, dress, simple_background, cape, open_mouth, white_background, white_pantyhose, boots, eating, looking_at_viewer, purple_gloves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | hetero | multiple_penises | solo_focus | mosaic_censoring | gangbang | nipples | vaginal | blush | cum_in_pussy | medium_breasts | 2boys | 3boys | circlet | facial | fellatio | handjob | testicles | solo | open_mouth | completely_nude | navel | artist_name | food | hair_flower | large_breasts | looking_at_viewer | pussy | signature | simple_background | sitting | cape | skirt | low-tied_long_hair | book | white_background | full_body | short_sleeves | bangs | capelet | hood_down | low_twintails | white_footwear | miniskirt | shiny_hair | belt_pouch | closed_mouth | purple_skirt | holding_book | jewelry | knee_boots | magic | open_book | alternate_costume | candy | halloween_costume | holding | long_sleeves | dress | white_pantyhose | boots | eating | purple_gloves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-------------------|:-------------|:-------------------|:-----------|:----------|:----------|:--------|:---------------|:-----------------|:--------|:--------|:----------|:---------|:-----------|:----------|:------------|:-------|:-------------|:------------------|:--------|:--------------|:-------|:--------------|:----------------|:--------------------|:--------|:------------|:--------------------|:----------|:-------|:--------|:---------------------|:-------|:-------------------|:------------|:----------------|:--------|:----------|:------------|:----------------|:-----------------|:------------|:-------------|:-------------|:---------------|:---------------|:---------------|:----------|:-------------|:--------|:------------|:--------------------|:--------|:--------------------|:----------|:---------------|:--------|:------------------|:--------|:---------|:----------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | | | | | X | | X | | X | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | | | | | | | | | | | | X | | | | | X | | | | | | | | X | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | | | | | | | | | | | | | X | | | | | X | X | | | | | | | X | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/elaice_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T13:47:37+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T14:12:05+00:00 |
660a5b20463b76b2c46f6e404e3f216eb881a3cd |
# Dataset of nepenee/ネフェニー (Fire Emblem)
This is the dataset of nepenee/ネフェニー (Fire Emblem), containing 189 images and their tags.
The core tags of this character are `green_hair, long_hair, green_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 189 | 199.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nepenee_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 189 | 132.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nepenee_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 385 | 240.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nepenee_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 189 | 185.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nepenee_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 385 | 312.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nepenee_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nepenee_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, blue_armor, breastplate, helmet, solo, spear, thighhighs, blue_eyes, boots, skirt, full_body, holding_weapon, shield, simple_background, belt, white_background, detached_sleeves, looking_at_viewer |
| 1 | 5 |  |  |  |  |  | 1girl, blue_armor, helmet, solo, breastplate, spear, shield, thighhighs, belt |
| 2 | 7 |  |  |  |  |  | 1girl, hetero, solo_focus, vaginal, blush, nipples, rape, armor, helmet, multiple_penises, cum_in_pussy, large_breasts, mosaic_censoring, spread_legs, tears, thighhighs, torn_clothes, 3boys, gangbang, medium_breasts, mmf_threesome, straddling |
| 3 | 7 |  |  |  |  |  | 1boy, 1girl, helmet, hetero, day, large_breasts, nipples, open_mouth, blush, cum_in_pussy, penis, solo_focus, vaginal, bar_censor, blue_armor, breasts_out, clothed_sex, overflow, very_long_hair, anus, blue_sky, outdoors |
| 4 | 6 |  |  |  |  |  | 1girl, day, looking_at_viewer, outdoors, solo, cloud, large_breasts, navel, black_bikini, blue_sky, blush, cleavage, helmet, ocean |
| 5 | 5 |  |  |  |  |  | 1girl, blue_dress, collarbone, helmet, long_sleeves, medium_breasts, solo, veil, wide_sleeves, aqua_eyes, bangs, blue_footwear, full_body, gradient_hair, puffy_sleeves, simple_background, frilled_sleeves, looking_at_viewer, shoes, smile, white_background, arrow_(projectile), bare_shoulders, blue_armor, closed_mouth, detached_sleeves, holding_bow_(weapon), looking_away, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_armor | breastplate | helmet | solo | spear | thighhighs | blue_eyes | boots | skirt | full_body | holding_weapon | shield | simple_background | belt | white_background | detached_sleeves | looking_at_viewer | hetero | solo_focus | vaginal | blush | nipples | rape | armor | multiple_penises | cum_in_pussy | large_breasts | mosaic_censoring | spread_legs | tears | torn_clothes | 3boys | gangbang | medium_breasts | mmf_threesome | straddling | 1boy | day | open_mouth | penis | bar_censor | breasts_out | clothed_sex | overflow | very_long_hair | anus | blue_sky | outdoors | cloud | navel | black_bikini | cleavage | ocean | blue_dress | collarbone | long_sleeves | veil | wide_sleeves | aqua_eyes | bangs | blue_footwear | gradient_hair | puffy_sleeves | frilled_sleeves | shoes | smile | arrow_(projectile) | bare_shoulders | closed_mouth | holding_bow_(weapon) | looking_away | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:--------------|:---------|:-------|:--------|:-------------|:------------|:--------|:--------|:------------|:-----------------|:---------|:--------------------|:-------|:-------------------|:-------------------|:--------------------|:---------|:-------------|:----------|:--------|:----------|:-------|:--------|:-------------------|:---------------|:----------------|:-------------------|:--------------|:--------|:---------------|:--------|:-----------|:-----------------|:----------------|:-------------|:-------|:------|:-------------|:--------|:-------------|:--------------|:--------------|:-----------|:-----------------|:-------|:-----------|:-----------|:--------|:--------|:---------------|:-----------|:--------|:-------------|:-------------|:---------------|:-------|:---------------|:------------|:--------|:----------------|:----------------|:----------------|:------------------|:--------|:--------|:---------------------|:-----------------|:---------------|:-----------------------|:---------------|:-----------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | | X | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | | X | | | | | | | | | | | | | | | X | X | X | X | X | | | | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | | X | X | | | | | | | | | | | | | X | | | | X | | | | | | X | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | | X | X | | | | | | X | | | X | | X | X | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/nepenee_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T13:47:39+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T14:19:14+00:00 |
11087f376ed45b93d2a93bb9f8efcb4bf78b8c29 |
# Dataset of kazahana/カザハナ (Fire Emblem)
This is the dataset of kazahana/カザハナ (Fire Emblem), containing 127 images and their tags.
The core tags of this character are `long_hair, brown_hair, brown_eyes, headband, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 127 | 144.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kazahana_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 127 | 88.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kazahana_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 280 | 175.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kazahana_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 127 | 131.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kazahana_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 280 | 232.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kazahana_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kazahana_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1boy, 1girl, hetero, penis, sex, blush, navel, nipples, nude, spread_legs, vaginal, anal, bar_censor, cum_in_pussy, fingering, large_breasts, open_mouth, pussy_juice, rape, sweat, tears, clenched_teeth, female_ejaculation, interspecies, medium_breasts, saliva, smile, solo_focus, thighhighs |
| 1 | 9 |  |  |  |  |  | navel, nipples, 1girl, pussy, blush, medium_breasts, female_pubic_hair, completely_nude, smile, solo, hetero, open_mouth, penis, sex, small_breasts, vaginal |
| 2 | 39 |  |  |  |  |  | 1girl, solo, armor, katana, simple_background, smile, japanese_clothes, holding_weapon, open_mouth, thighhighs, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1boy | 1girl | hetero | penis | sex | blush | navel | nipples | nude | spread_legs | vaginal | anal | bar_censor | cum_in_pussy | fingering | large_breasts | open_mouth | pussy_juice | rape | sweat | tears | clenched_teeth | female_ejaculation | interspecies | medium_breasts | saliva | smile | solo_focus | thighhighs | pussy | female_pubic_hair | completely_nude | solo | small_breasts | armor | katana | simple_background | japanese_clothes | holding_weapon | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------|:--------|:---------|:--------|:------|:--------|:--------|:----------|:-------|:--------------|:----------|:-------|:-------------|:---------------|:------------|:----------------|:-------------|:--------------|:-------|:--------|:--------|:-----------------|:---------------------|:---------------|:-----------------|:---------|:--------|:-------------|:-------------|:--------|:--------------------|:------------------|:-------|:----------------|:--------|:---------|:--------------------|:-------------------|:-----------------|:-------------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | | X | X | X | X | X | X | X | | | X | | | | | | X | | | | | | | | X | | X | | | X | X | X | X | X | | | | | | |
| 2 | 39 |  |  |  |  |  | | X | | | | | | | | | | | | | | | X | | | | | | | | | | X | | X | | | | X | | X | X | X | X | X | X |
| CyberHarem/kazahana_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T13:47:47+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T14:17:22+00:00 |
e3237a269a0af3c2db102d551e3e22994b62c8d3 |
# Dataset of veronica/ヴェロニカ (Fire Emblem)
This is the dataset of veronica/ヴェロニカ (Fire Emblem), containing 215 images and their tags.
The core tags of this character are `red_eyes, grey_hair, long_hair, hair_ornament, crown, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 215 | 242.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/veronica_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 215 | 150.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/veronica_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 467 | 308.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/veronica_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 215 | 218.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/veronica_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 467 | 405.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/veronica_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/veronica_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, simple_background, solo, closed_mouth, looking_at_viewer, white_background, upper_body |
| 1 | 5 |  |  |  |  |  | 1girl, black_armor, simple_background, solo, closed_mouth, shoulder_armor, upper_body, white_background, looking_at_viewer, cape |
| 2 | 16 |  |  |  |  |  | 1girl, black_gloves, cape, shoulder_armor, long_sleeves, solo, black_armor, holding_staff, closed_mouth, simple_background, high_heels |
| 3 | 5 |  |  |  |  |  | 1girl, dress, easter_egg, fake_animal_ears, hair_flower, rabbit_ears, simple_background, solo, white_gloves, open_mouth, see-through, wrist_cuffs, rabbit_tail, cleavage_cutout, holding, twitter_username, white_pantyhose |
| 4 | 15 |  |  |  |  |  | 1girl, nipples, 1boy, blush, hetero, open_mouth, small_breasts, solo_focus, penis, vaginal, sex, bar_censor, cum_in_pussy, navel, on_back, nude, torn_clothes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | simple_background | solo | closed_mouth | looking_at_viewer | white_background | upper_body | black_armor | shoulder_armor | cape | black_gloves | long_sleeves | holding_staff | high_heels | dress | easter_egg | fake_animal_ears | hair_flower | rabbit_ears | white_gloves | open_mouth | see-through | wrist_cuffs | rabbit_tail | cleavage_cutout | holding | twitter_username | white_pantyhose | nipples | 1boy | blush | hetero | small_breasts | solo_focus | penis | vaginal | sex | bar_censor | cum_in_pussy | navel | on_back | nude | torn_clothes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:---------------|:--------------------|:-------------------|:-------------|:--------------|:-----------------|:-------|:---------------|:---------------|:----------------|:-------------|:--------|:-------------|:-------------------|:--------------|:--------------|:---------------|:-------------|:--------------|:--------------|:--------------|:------------------|:----------|:-------------------|:------------------|:----------|:-------|:--------|:---------|:----------------|:-------------|:--------|:----------|:------|:-------------|:---------------|:--------|:----------|:-------|:---------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 16 |  |  |  |  |  | X | X | X | X | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 4 | 15 |  |  |  |  |  | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/veronica_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T13:47:50+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T14:34:35+00:00 |
6e747aea0d8217e45489f5f4e05d09d1dfbd8eaf | jijivski/mock_mmlu | [
"license:mit",
"region:us"
] | 2024-01-17T13:57:20+00:00 | {"license": "mit"} | 2024-01-19T08:05:45+00:00 |
|
36b3c2d25e64d8491daa1afe2712e13d40f29371 | jijivski/mock_gsm8k | [
"license:mit",
"region:us"
] | 2024-01-17T14:00:22+00:00 | {"license": "mit"} | 2024-01-19T01:55:14+00:00 |
|
1261f8fd538d07e6aa37012e91b6f047f1e36ca8 |
# Dataset of chloe/クロエ (Fire Emblem)
This is the dataset of chloe/クロエ (Fire Emblem), containing 177 images and their tags.
The core tags of this character are `breasts, long_hair, green_eyes, braid, large_breasts, aqua_hair, bangs, earrings, bow, hair_bow, blue_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 177 | 292.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chloe_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 177 | 149.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chloe_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 440 | 327.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chloe_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 177 | 250.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chloe_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 440 | 499.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chloe_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/chloe_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, cleavage, elbow_gloves, looking_at_viewer, shoulder_armor, smile, white_gloves, simple_background, solo, upper_body, jewelry, blush, covered_navel, green_hair, white_background |
| 1 | 5 |  |  |  |  |  | 1girl, breastplate, cleavage, elbow_gloves, solo, white_gloves, covered_navel, green_hair, jewelry, looking_at_viewer, open_mouth, shoulder_armor, :d |
| 2 | 9 |  |  |  |  |  | 1girl, elbow_gloves, solo, white_gloves, breastplate, cleavage, looking_at_viewer, smile, jewelry, pegasus_knight_uniform_(fire_emblem), shoulder_armor, holding_polearm, spear, covered_navel |
| 3 | 9 |  |  |  |  |  | 1girl, cleavage, smile, solo, blush, looking_at_viewer, collarbone, necklace, upper_body, green_dress, green_hair, closed_mouth, holding, short_sleeves, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | elbow_gloves | looking_at_viewer | shoulder_armor | smile | white_gloves | simple_background | solo | upper_body | jewelry | blush | covered_navel | green_hair | white_background | breastplate | open_mouth | :d | pegasus_knight_uniform_(fire_emblem) | holding_polearm | spear | collarbone | necklace | green_dress | closed_mouth | holding | short_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:---------------|:--------------------|:-----------------|:--------|:---------------|:--------------------|:-------|:-------------|:----------|:--------|:----------------|:-------------|:-------------------|:--------------|:-------------|:-----|:---------------------------------------|:------------------|:--------|:-------------|:-----------|:--------------|:---------------|:----------|:----------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | X | | X | | X | | X | X | | X | X | X | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | | X | | X | | X | | | X | | | X | X | X | | | | | | |
| 3 | 9 |  |  |  |  |  | X | X | | X | | X | | X | X | X | | X | | X | X | | | | | | | X | X | X | X | X | X |
| CyberHarem/chloe_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T14:03:28+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T14:39:30+00:00 |
bcee32dd3f400311397d1af1111b3b5bbe6db2a6 |
# Dataset of ferry/フュリー (Fire Emblem)
This is the dataset of ferry/フュリー (Fire Emblem), containing 500 images and their tags.
The core tags of this character are `long_hair, green_hair, green_eyes, ponytail, breasts, earrings, bangs, large_breasts, very_long_hair, high_ponytail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 723.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ferry_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 409.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ferry_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1211 | 856.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ferry_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 643.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ferry_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1211 | 1.19 GiB | [Download](https://huggingface.co/datasets/CyberHarem/ferry_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
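Any of the packaged variants above can be fetched the same way as the raw archive shown below; only the `filename` changes. A minimal sketch for the 800px package (the filename matches the download link in the table):

```python
from huggingface_hub import hf_hub_download

# fetch the 800px variant of this dataset
zip_file = hf_hub_download(
    repo_id='CyberHarem/ferry_fireemblem',
    repo_type='dataset',
    filename='dataset-800.zip',
)
print(zip_file)  # local path to the downloaded archive
```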
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ferry_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
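Since each item exposes its tag metadata, simple tag-based filtering works directly on the iterator. A minimal sketch (assuming, as printed above, that `item.meta['tags']` holds the tag names for each image):

```python
from waifuc.source import LocalSource

# keep only the images tagged 'solo' from the extracted directory
source = LocalSource('dataset_dir')
for item in source:
    tags = item.meta.get('tags', {})
    if 'solo' in tags:
        print(item.meta['filename'])
```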
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 28 |  |  |  |  |  | 1girl, mask_on_head, official_alternate_costume, solo, chest_sarashi, tube_top, bandages, looking_at_viewer, smile, jewelry, single_bare_shoulder, cleavage, bandeau, midriff, blue_skirt, navel, single_sleeve, stomach, holding_weapon, blue_shirt, polearm, white_background, bare_shoulders, simple_background, collarbone, short_sleeves, standing, thighs |
| 1 | 5 |  |  |  |  |  | 1girl, blue_dress, fingerless_gloves, holding_sword, jewelry, looking_at_viewer, sheath, solo, katana, pelvic_curtain, short_sleeves, thighs, side_slit, black_gloves, boots, smile |
| 2 | 7 |  |  |  |  |  | 1girl, black_gloves, boots, simple_background, white_background, blue_dress, fingerless_gloves, holding_sword, pelvic_curtain, full_body, solo, side_slit |
| 3 | 10 |  |  |  |  |  | 1girl, dress, jewelry, solo, arrow_(projectile), fingerless_gloves, holding_bow_(weapon), white_background, feathers, simple_background, quiver, smile, fur_trim, hair_ornament, looking_at_viewer, pelvic_curtain, short_sleeves, cape, full_body, knee_boots, thighs, belt, closed_mouth, elbow_gloves, medium_breasts, shoulder_armor, standing |
| 4 | 8 |  |  |  |  |  | 1girl, blush, hair_flower, jewelry, official_alternate_costume, solo, bare_shoulders, blue_bikini, cleavage, looking_at_viewer, choker, collarbone, smile, simple_background, closed_mouth, navel, open_mouth, strapless_bikini, white_background |
| 5 | 7 |  |  |  |  |  | 1girl, jewelry, looking_at_viewer, ocean, outdoors, solo, beach, blue_bikini, blue_sky, cleavage, cloud, day, hair_flower, navel, official_alternate_costume, bare_shoulders, smile, strapless_bikini, blush, collarbone, thighs, water, choker, wet |
| 6 | 9 |  |  |  |  |  | bare_shoulders, bride, necklace, strapless_dress, wedding_dress, 1girl, hair_flower, official_alternate_costume, white_dress, cleavage, looking_at_viewer, solo, bouquet, open_mouth, smile, bridal_veil, detached_sleeves, gloves, medium_breasts |
| 7 | 5 |  |  |  |  |  | 1boy, 1girl, blush, hetero, solo_focus, sweat, open_mouth, mosaic_censoring, nipples, cum, erection, jewelry, licking_penis, nude, tongue_out |
| 8 | 11 |  |  |  |  |  | 1girl, blush, hetero, solo_focus, 1boy, penis, sex, open_mouth, nipples, pussy, vaginal, completely_nude, mosaic_censoring, smile, jewelry, spread_legs, sweat, looking_at_viewer, ass, dark-skinned_male, gloves, lying, navel, straddling, thighhighs |
| 9 | 6 |  |  |  |  |  | 1girl, elbow_gloves, armpits, arms_up, blush, looking_at_viewer, solo, thighhighs, white_gloves, armor, open_mouth, pole_dancing, stripper_pole, :d, censored, pussy, spread_legs, sweat, thigh_boots |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | mask_on_head | official_alternate_costume | solo | chest_sarashi | tube_top | bandages | looking_at_viewer | smile | jewelry | single_bare_shoulder | cleavage | bandeau | midriff | blue_skirt | navel | single_sleeve | stomach | holding_weapon | blue_shirt | polearm | white_background | bare_shoulders | simple_background | collarbone | short_sleeves | standing | thighs | blue_dress | fingerless_gloves | holding_sword | sheath | katana | pelvic_curtain | side_slit | black_gloves | boots | full_body | dress | arrow_(projectile) | holding_bow_(weapon) | feathers | quiver | fur_trim | hair_ornament | cape | knee_boots | belt | closed_mouth | elbow_gloves | medium_breasts | shoulder_armor | blush | hair_flower | blue_bikini | choker | open_mouth | strapless_bikini | ocean | outdoors | beach | blue_sky | cloud | day | water | wet | bride | necklace | strapless_dress | wedding_dress | white_dress | bouquet | bridal_veil | detached_sleeves | gloves | 1boy | hetero | solo_focus | sweat | mosaic_censoring | nipples | cum | erection | licking_penis | nude | tongue_out | penis | sex | pussy | vaginal | completely_nude | spread_legs | ass | dark-skinned_male | lying | straddling | thighhighs | armpits | arms_up | white_gloves | armor | pole_dancing | stripper_pole | :d | censored | thigh_boots |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-----------------------------|:-------|:----------------|:-----------|:-----------|:--------------------|:--------|:----------|:-----------------------|:-----------|:----------|:----------|:-------------|:--------|:----------------|:----------|:-----------------|:-------------|:----------|:-------------------|:-----------------|:--------------------|:-------------|:----------------|:-----------|:---------|:-------------|:--------------------|:----------------|:---------|:---------|:-----------------|:------------|:---------------|:--------|:------------|:--------|:---------------------|:-----------------------|:-----------|:---------|:-----------|:----------------|:-------|:-------------|:-------|:---------------|:---------------|:-----------------|:-----------------|:--------|:--------------|:--------------|:---------|:-------------|:-------------------|:--------|:-----------|:--------|:-----------|:--------|:------|:--------|:------|:--------|:-----------|:------------------|:----------------|:--------------|:----------|:--------------|:-------------------|:---------|:-------|:---------|:-------------|:--------|:-------------------|:----------|:------|:-----------|:----------------|:-------|:-------------|:--------|:------|:--------|:----------|:------------------|:--------------|:------|:--------------------|:--------|:-------------|:-------------|:----------|:----------|:---------------|:--------|:---------------|:----------------|:-----|:-----------|:--------------|
| 0 | 28 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | | X | | | | X | X | X | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | | X | | | | | | | | | | | | | | | | | | X | | X | | | | | X | X | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | | | X | | | | X | X | X | | | | | | | | | | | | X | | X | | X | X | X | | X | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | | X | X | | | | X | X | X | | X | | | | X | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | | X | X | | | | X | X | X | | X | | | | X | | | | | | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | | X | X | | | | X | X | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 8 | 11 |  |  |  |  |  | X | | | | | | | X | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 9 | 6 |  |  |  |  |  | X | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | | | X | | | | | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/ferry_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T14:03:29+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T16:04:38+00:00 |
dbb934ad2aaaf7b0bf90b7083bfdcd64d3941e8c |
# Dataset of aqua (Fire Emblem)
This is the dataset of aqua (Fire Emblem), containing 500 images and their tags.
The core tags of this character are `blue_hair, long_hair, yellow_eyes, hair_between_eyes, very_long_hair, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 700.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aqua_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 412.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aqua_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1137 | 805.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aqua_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 631.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aqua_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1137 | 1.07 GiB | [Download](https://huggingface.co/datasets/CyberHarem/aqua_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/aqua_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, elbow_gloves, fingerless_gloves, solo, veil, white_dress, white_gloves, water, open_mouth, looking_at_viewer, necklace, pendant, ribbon |
| 1 | 7 |  |  |  |  |  | 1girl, dress, elbow_gloves, looking_at_viewer, solo, fingerless_gloves, water, simple_background, smile, white_background, jewelry, mouth_veil, open_mouth, white_gloves |
| 2 | 7 |  |  |  |  |  | 1girl, barefoot, elbow_gloves, fingerless_gloves, solo, veil, anklet, looking_at_viewer, white_dress, white_gloves, full_body, simple_background, spear, white_background, holding_weapon, necklace, water |
| 3 | 5 |  |  |  |  |  | 1girl, anklet, barefoot, dress, elbow_gloves, looking_at_viewer, solo, veil, dakimakura_(medium), fingerless_gloves, full_body, on_back, blush, ripples, water |
| 4 | 9 |  |  |  |  |  | 1girl, nipples, solo, completely_nude, looking_at_viewer, navel, medium_breasts, simple_background, white_background, veil, smile |
| 5 | 5 |  |  |  |  |  | 1girl, blush, completely_nude, nipples, solo, collarbone, large_breasts, looking_at_viewer, veil, closed_mouth, medium_breasts, navel, smile, arms_behind_back, curtains, indoors, onsen, partially_submerged, stomach, water, wet |
| 6 | 6 |  |  |  |  |  | 1girl, blush, collarbone, completely_nude, looking_at_viewer, navel, solo, stomach, water, wet, bangs, closed_mouth, day, nipples, outdoors, wading, armpits, cowboy_shot, veil, arm_up, blue_sky, groin, large_breasts, medium_breasts, smile, thighs, tree |
| 7 | 8 |  |  |  |  |  | 1girl, bangs, crop_top, looking_at_viewer, midriff, navel, necklace, solo, veil, cleavage, collarbone, holding, closed_mouth, medium_breasts, official_alternate_costume, smile, blue_skirt, circlet, fire, see-through, standing, thighlet, bare_shoulders, blush, cowboy_shot, full_body, hand_up, large_breasts, light_blue_hair, pantyhose, torch |
| 8 | 9 |  |  |  |  |  | 1boy, 1girl, hetero, penis, solo_focus, veil, blush, nipples, large_breasts, looking_at_viewer, male_pubic_hair, mosaic_censoring, paizuri, pov, fellatio, breasts_squeezed_together |
| 9 | 5 |  |  |  |  |  | 1girl, obi, solo, veil, full_body, hagoita, holding, sandals, simple_background, wide_sleeves, floral_print, hair_ornament, hanetsuki, long_sleeves, looking_at_viewer, open_mouth, blue_kimono, flower, grey_background, hair_tubes, smile, tabi |
| 10 | 9 |  |  |  |  |  | 1girl, cum_in_pussy, hetero, multiple_boys, multiple_penises, nipples, rape, mosaic_censoring, solo_focus, cum_on_body, gangbang, large_breasts, vaginal, veil, blush, asymmetrical_legwear, breast_grab, gloves, grabbing, navel, pantyhose, tears, anal, facial, open_mouth, spread_legs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | elbow_gloves | fingerless_gloves | solo | veil | white_dress | white_gloves | water | open_mouth | looking_at_viewer | necklace | pendant | ribbon | dress | simple_background | smile | white_background | jewelry | mouth_veil | barefoot | anklet | full_body | spear | holding_weapon | dakimakura_(medium) | on_back | blush | ripples | nipples | completely_nude | navel | medium_breasts | collarbone | large_breasts | closed_mouth | arms_behind_back | curtains | indoors | onsen | partially_submerged | stomach | wet | bangs | day | outdoors | wading | armpits | cowboy_shot | arm_up | blue_sky | groin | thighs | tree | crop_top | midriff | cleavage | holding | official_alternate_costume | blue_skirt | circlet | fire | see-through | standing | thighlet | bare_shoulders | hand_up | light_blue_hair | pantyhose | torch | 1boy | hetero | penis | solo_focus | male_pubic_hair | mosaic_censoring | paizuri | pov | fellatio | breasts_squeezed_together | obi | hagoita | sandals | wide_sleeves | floral_print | hair_ornament | hanetsuki | long_sleeves | blue_kimono | flower | grey_background | hair_tubes | tabi | cum_in_pussy | multiple_boys | multiple_penises | rape | cum_on_body | gangbang | vaginal | asymmetrical_legwear | breast_grab | gloves | grabbing | tears | anal | facial | spread_legs |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:---------------|:--------------------|:-------|:-------|:--------------|:---------------|:--------|:-------------|:--------------------|:-----------|:----------|:---------|:--------|:--------------------|:--------|:-------------------|:----------|:-------------|:-----------|:---------|:------------|:--------|:-----------------|:----------------------|:----------|:--------|:----------|:----------|:------------------|:--------|:-----------------|:-------------|:----------------|:---------------|:-------------------|:-----------|:----------|:--------|:----------------------|:----------|:------|:--------|:------|:-----------|:---------|:----------|:--------------|:---------|:-----------|:--------|:---------|:-------|:-----------|:----------|:-----------|:----------|:-----------------------------|:-------------|:----------|:-------|:--------------|:-----------|:-----------|:-----------------|:----------|:------------------|:------------|:--------|:-------|:---------|:--------|:-------------|:------------------|:-------------------|:----------|:------|:-----------|:----------------------------|:------|:----------|:----------|:---------------|:---------------|:----------------|:------------|:---------------|:--------------|:---------|:------------------|:-------------|:-------|:---------------|:----------------|:-------------------|:-------|:--------------|:-----------|:----------|:-----------------------|:--------------|:---------|:-----------|:--------|:-------|:---------|:--------------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | | | X | X | X | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | X | | | | X | | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | X | X | | | X | | X | | | | X | | | | | | X | X | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | | | X | X | | | | | X | | | | | X | X | X | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | X | X | | | X | | X | | | | | | X | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | | X | X | | | X | | X | | | | | | X | | | | | | | | | | | X | | X | X | X | X | X | X | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 8 |  |  |  |  |  | X | | | X | X | | | | | X | X | | | | | X | | | | | | X | | | | | X | | | | X | X | X | X | X | | | | | | | | X | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 9 |  |  |  |  |  | X | | | | X | | | | | X | | | | | | | | | | | | | | | | | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | | | X | X | | | | X | X | | | | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 10 | 9 |  |  |  |  |  | X | | | | X | | | | X | | | | | | | | | | | | | | | | | | X | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | X | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/aqua_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T14:03:37+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T15:47:47+00:00 |
3d6d648aa9630b243f02a8fed42b1b53c2770eab |
# Dataset of citrinne/シトリニカ (Fire Emblem)
This is the dataset of citrinne/シトリニカ (Fire Emblem), containing 144 images and their tags.
The core tags of this character are `short_hair, blonde_hair, red_eyes, hair_ornament, breasts, bangs, medium_breasts, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 144 | 234.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/citrinne_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 144 | 127.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/citrinne_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 364 | 280.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/citrinne_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 144 | 207.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/citrinne_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 364 | 415.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/citrinne_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/citrinne_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, cleavage, smile, solo, detached_sleeves, holding_book, looking_at_viewer, blush, brown_dress, necklace, open_mouth |
| 1 | 8 |  |  |  |  |  | 1girl, dress, smile, solo, cleavage, upper_body, bare_shoulders, detached_sleeves, looking_at_viewer, necklace, official_alternate_costume |
| 2 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, navel, solo, long_sleeves, midriff, closed_mouth, crop_top, smile, stomach, arms_up, collarbone, shirt, simple_background, blush, choker, cleavage, on_back, pants, small_breasts, thighhighs, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | smile | solo | detached_sleeves | holding_book | looking_at_viewer | blush | brown_dress | necklace | open_mouth | dress | upper_body | bare_shoulders | official_alternate_costume | navel | long_sleeves | midriff | closed_mouth | crop_top | stomach | arms_up | collarbone | shirt | simple_background | choker | on_back | pants | small_breasts | thighhighs | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------|:-------|:-------------------|:---------------|:--------------------|:--------|:--------------|:-----------|:-------------|:--------|:-------------|:-----------------|:-----------------------------|:--------|:---------------|:----------|:---------------|:-----------|:----------|:----------|:-------------|:--------|:--------------------|:---------|:----------|:--------|:----------------|:-------------|:-------------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | X | | X | | | X | | X | X | X | X | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | | | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/citrinne_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T14:03:42+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T14:31:14+00:00 |
d82881b792f70b6f4ac8843f72574f8ee992fdb9 | ksuyash/food-dataset | [
"region:us"
] | 2024-01-17T14:14:30+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "burger", "1": "butter_naan", "2": "chai", "3": "chapati", "4": "chole_bhature", "5": "dal_makhani", "6": "dhokla", "7": "fried_rice", "8": "idli", "9": "jalebi", "10": "kaathi_rolls", "11": "kadai_paneer", "12": "kulfi", "13": "masala_dosa", "14": "momos", "15": "paani_puri", "16": "pakode", "17": "pav_bhaji", "18": "pizza", "19": "samosa"}}}}], "splits": [{"name": "train", "num_bytes": 1716156838.154, "num_examples": 6269}], "download_size": 1579640750, "dataset_size": 1716156838.154}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-18T20:21:06+00:00 |
|
6eeb29000c417da29a4826acfc16d3029f18abf5 | LunarMartins/Voice | [
"license:openrail",
"region:us"
] | 2024-01-17T14:21:33+00:00 | {"license": "openrail"} | 2024-01-30T16:36:52+00:00 |
|
2489c9b2fe3b47416bae5f54937b9549a7d4d89f | reza-alipour/muse-landmark-1500 | [
"region:us"
] | 2024-01-17T14:25:58+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "image", "dtype": "image"}, {"name": "mask", "dtype": "image"}, {"name": "caption", "dtype": "string"}, {"name": "caption_fre", "dtype": "string"}, {"name": "caption_deu", "dtype": "string"}, {"name": "caption_ita", "dtype": "string"}, {"name": "caption_spa", "dtype": "string"}, {"name": "generated_mask", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 570015596.25, "num_examples": 1498}], "download_size": 548973105, "dataset_size": 570015596.25}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-17T14:26:37+00:00 |
|
5ca6e369f0e723c9c745452ba9a677b7f1dae159 |
# Dataset of hortensia/オルテンシア (Fire Emblem)
This is the dataset of hortensia/オルテンシア (Fire Emblem), containing 156 images and their tags.
The core tags of this character are `pink_hair, bangs, pink_eyes, breasts, hair_rings, multicolored_hair, facial_mark, bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 156 | 247.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hortensia_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 156 | 136.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hortensia_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 362 | 293.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hortensia_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 156 | 214.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hortensia_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 362 | 433.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hortensia_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hortensia_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, heart, looking_at_viewer, open_mouth, smile, solo, one_eye_closed, juliet_sleeves, ;d, cleavage, red_rose, upper_body, white_background, simple_background, blush, streaked_hair, medium_breasts, v_over_eye |
| 1 | 6 |  |  |  |  |  | 1girl, juliet_sleeves, looking_at_viewer, red_rose, smile, solo, simple_background, cleavage, heart_tattoo, medium_breasts, open_mouth, upper_body, green_background |
| 2 | 6 |  |  |  |  |  | 1girl, hair_bow, looking_at_viewer, smile, solo, choker, earrings, upper_body, heart_hands, long_sleeves, open_mouth, black_gloves, cleavage, polka_dot_bow, purple_eyes, red_jacket, simple_background, streaked_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | heart | looking_at_viewer | open_mouth | smile | solo | one_eye_closed | juliet_sleeves | ;d | cleavage | red_rose | upper_body | white_background | simple_background | blush | streaked_hair | medium_breasts | v_over_eye | heart_tattoo | green_background | hair_bow | choker | earrings | heart_hands | long_sleeves | black_gloves | polka_dot_bow | purple_eyes | red_jacket |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------------|:--------|:-------|:-----------------|:-----------------|:-----|:-----------|:-----------|:-------------|:-------------------|:--------------------|:--------|:----------------|:-----------------|:-------------|:---------------|:-------------------|:-----------|:---------|:-----------|:--------------|:---------------|:---------------|:----------------|:--------------|:-------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | | X | X | X | X | | X | | X | X | X | | X | | | X | | X | X | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | X | X | X | X | | | | X | | X | | X | | X | | | | | X | X | X | X | X | X | X | X | X |
| CyberHarem/hortensia_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T14:27:05+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T14:55:59+00:00 |
a3f246a51045405acfb831f7496aa6d482feaa8f |
# Dataset of orochi/オロチ (Fire Emblem)
This is the dataset of orochi/オロチ (Fire Emblem), containing 96 images and their tags.
The core tags of this character are `long_hair, breasts, hair_ornament, purple_eyes, purple_hair, large_breasts, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 96 | 97.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orochi_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 96 | 63.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orochi_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 227 | 132.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orochi_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 96 | 89.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orochi_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 227 | 174.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orochi_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/orochi_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, solo, jewelry, midriff, smile, looking_at_viewer, navel, cleavage, simple_background, bare_shoulders, white_background |
| 1 | 13 |  |  |  |  |  | 1boy, hetero, 1girl, penis, nipples, solo_focus, blush, jewelry, cum_on_breasts, facial, open_mouth, smile, nude, uncensored |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | jewelry | midriff | smile | looking_at_viewer | navel | cleavage | simple_background | bare_shoulders | white_background | 1boy | hetero | penis | nipples | solo_focus | blush | cum_on_breasts | facial | open_mouth | nude | uncensored |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------|:----------|:--------|:--------------------|:--------|:-----------|:--------------------|:-----------------|:-------------------|:-------|:---------|:--------|:----------|:-------------|:--------|:-----------------|:---------|:-------------|:-------|:-------------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | | X | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/orochi_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T14:27:14+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T14:48:33+00:00 |
bf2ac1c0ce5d2f4e75029a73c5c7d33dbd79f9ce |
# Dataset of rinka/リンカ (Fire Emblem)
This is the dataset of rinka/リンカ (Fire Emblem), containing 142 images and their tags.
The core tags of this character are `dark_skin, dark-skinned_female, white_hair, red_eyes, breasts, facial_mark, muscular_female, mask_on_head, multicolored_hair, red_hair, short_hair, large_breasts, two-tone_hair, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 142 | 200.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rinka_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 142 | 110.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rinka_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 371 | 237.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rinka_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 142 | 175.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rinka_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 371 | 335.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rinka_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/rinka_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 40 |  |  |  |  |  | 1girl, abs, mask, muscular, solo, navel, bandages, beads, necklace, midriff, looking_at_viewer, simple_background, chest_sarashi, club_(weapon), bare_shoulders, biceps |
| 1 | 6 |  |  |  |  |  | 1girl, abs, long_hair, mask, muscular, navel, solo, bandeau, bangs, bare_shoulders, chest_sarashi, fire, looking_at_viewer, midriff, official_alternate_costume, standing, stomach, tube_top, bandages, full_body, simple_background, white_background, clenched_teeth, grin, holding_fan, lantern, pelvic_curtain, thigh_strap, thighs, uchiwa, whisker_markings |
| 2 | 5 |  |  |  |  |  | 1girl, abs, mask, muscular, nipples, pussy, solo, completely_nude, female_pubic_hair, obliques, looking_at_viewer, navel, standing, uncensored, ass_visible_through_thighs, biceps, blush, cowboy_shot, facepaint, outdoors |
| 3 | 8 |  |  |  |  |  | 1girl, hetero, penis, sex, vaginal, 1boy, abs, blush, cum_in_pussy, mask, muscular, nipples, necklace, open_mouth, solo_focus, faceless_male, girl_on_top, light_areolae, navel, nude, spread_legs, ahegao, bandages, bar_censor, beads, bottomless, straddling, sweat, tongue_out |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | abs | mask | muscular | solo | navel | bandages | beads | necklace | midriff | looking_at_viewer | simple_background | chest_sarashi | club_(weapon) | bare_shoulders | biceps | long_hair | bandeau | bangs | fire | official_alternate_costume | standing | stomach | tube_top | full_body | white_background | clenched_teeth | grin | holding_fan | lantern | pelvic_curtain | thigh_strap | thighs | uchiwa | whisker_markings | nipples | pussy | completely_nude | female_pubic_hair | obliques | uncensored | ass_visible_through_thighs | blush | cowboy_shot | facepaint | outdoors | hetero | penis | sex | vaginal | 1boy | cum_in_pussy | open_mouth | solo_focus | faceless_male | girl_on_top | light_areolae | nude | spread_legs | ahegao | bar_censor | bottomless | straddling | sweat | tongue_out |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------|:-------|:-----------|:-------|:--------|:-----------|:--------|:-----------|:----------|:--------------------|:--------------------|:----------------|:----------------|:-----------------|:---------|:------------|:----------|:--------|:-------|:-----------------------------|:-----------|:----------|:-----------|:------------|:-------------------|:-----------------|:-------|:--------------|:----------|:-----------------|:--------------|:---------|:---------|:-------------------|:----------|:--------|:------------------|:--------------------|:-----------|:-------------|:-----------------------------|:--------|:--------------|:------------|:-----------|:---------|:--------|:------|:----------|:-------|:---------------|:-------------|:-------------|:----------------|:--------------|:----------------|:-------|:--------------|:---------|:-------------|:-------------|:-------------|:--------|:-------------|
| 0 | 40 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | | | X | X | X | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | | | | | X | | | | | X | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/rinka_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T14:27:18+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T15:01:39+00:00 |
1f12faa2123474ed9714302c780952a1bb050876 |
# Dataset of kinu (Fire Emblem)
This is the dataset of kinu (Fire Emblem), containing 262 images and their tags.
The core tags of this character are `animal_ears, blonde_hair, fox_ears, yellow_eyes, fox_tail, multicolored_hair, tail, short_hair, streaked_hair, hair_ornament, breasts, brown_hair, two-tone_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 262 | 282.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kinu_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 262 | 167.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kinu_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 603 | 338.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kinu_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 262 | 251.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kinu_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 603 | 466.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kinu_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kinu_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, fur_trim, solo, japanese_clothes, fingerless_gloves, white_gloves, open_mouth, simple_background, smile, white_background |
| 1 | 6 |  |  |  |  |  | 1girl, bangs, full_body, fur_trim, japanese_clothes, long_sleeves, sandals, shiny_hair, simple_background, solo, tabi, white_background, wide_sleeves, fingerless_gloves, smile, thigh_strap, looking_at_viewer |
| 2 | 26 |  |  |  |  |  | 1girl, nipples, solo, blush, navel, censored, completely_nude, animal_ear_fluff, looking_at_viewer, open_mouth, smile, medium_breasts, spread_legs, fox_girl, pussy_juice, spread_pussy, anus |
| 3 | 8 |  |  |  |  |  | 1boy, 1girl, blush, cum_in_pussy, hetero, nipples, solo_focus, vaginal, open_mouth, animal_ear_fluff, bar_censor, medium_breasts, penis, spread_legs, female_pubic_hair, japanese_clothes, navel, overflow, bottomless, breasts_out, clothed_sex, cowgirl_position, fang, fingerless_gloves, heart-shaped_pupils, indoors, looking_at_viewer, smile, white_gloves |
| 4 | 5 |  |  |  |  |  | 1boy, 1girl, hetero, open_mouth, blush, nipples, sex_from_behind, solo_focus, doggystyle, large_breasts, nude, ahegao, closed_eyes, cum_on_body, saliva, sheet_grab, tears, tongue_out |
| 5 | 6 |  |  |  |  |  | 1girl, solo, open_mouth, smile, looking_at_viewer, navel, hair_flower, large_breasts, red_bikini, side-tie_bikini_bottom |
| 6 | 6 |  |  |  |  |  | 1boy, 1girl, blush, fellatio, hetero, penis, solo_focus, heart, animal_ear_fluff, cum_in_mouth, large_breasts, uncensored |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | fur_trim | solo | japanese_clothes | fingerless_gloves | white_gloves | open_mouth | simple_background | smile | white_background | bangs | full_body | long_sleeves | sandals | shiny_hair | tabi | wide_sleeves | thigh_strap | looking_at_viewer | nipples | blush | navel | censored | completely_nude | animal_ear_fluff | medium_breasts | spread_legs | fox_girl | pussy_juice | spread_pussy | anus | 1boy | cum_in_pussy | hetero | solo_focus | vaginal | bar_censor | penis | female_pubic_hair | overflow | bottomless | breasts_out | clothed_sex | cowgirl_position | fang | heart-shaped_pupils | indoors | sex_from_behind | doggystyle | large_breasts | nude | ahegao | closed_eyes | cum_on_body | saliva | sheet_grab | tears | tongue_out | hair_flower | red_bikini | side-tie_bikini_bottom | fellatio | heart | cum_in_mouth | uncensored |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:-------|:-------------------|:--------------------|:---------------|:-------------|:--------------------|:--------|:-------------------|:--------|:------------|:---------------|:----------|:-------------|:-------|:---------------|:--------------|:--------------------|:----------|:--------|:--------|:-----------|:------------------|:-------------------|:-----------------|:--------------|:-----------|:--------------|:---------------|:-------|:-------|:---------------|:---------|:-------------|:----------|:-------------|:--------|:--------------------|:-----------|:-------------|:--------------|:--------------|:-------------------|:-------|:----------------------|:----------|:------------------|:-------------|:----------------|:-------|:---------|:--------------|:--------------|:---------|:-------------|:--------|:-------------|:--------------|:-------------|:-------------------------|:-----------|:--------|:---------------|:-------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 26 |  |  |  |  |  | X | | X | | | | X | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | | | X | X | X | X | | X | | | | | | | | | | X | X | X | X | | | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | | | | X | | | | | | | | | | | | | X | X | | | | | | | | | | | X | | X | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | X | | | | X | | X | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | | | | |
| 6 | 6 |  |  |  |  |  | X | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | X | | X | X | | | X | | | | | | | | | | | | X | | | | | | | | | | | | X | X | X | X |
| CyberHarem/kinu_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T14:27:20+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T15:21:54+00:00 |
98fb3faf31d39681abcdd1b66e3c6ed7deb4bb2f | kraitans21/test-dataset-sample | [
"region:us"
] | 2024-01-17T14:32:14+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 17031325.910541903, "num_examples": 10000}, {"name": "eval", "num_bytes": 8515662.955270952, "num_examples": 5000}], "download_size": 14075691, "dataset_size": 25546988.865812853}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "eval", "path": "data/eval-*"}]}]} | 2024-01-17T15:15:31+00:00 |
|
f1c5b0bd67845178c90fa1b3c5299b207cc7f1aa | # Dataset Card for "SemEval_traindata_emotions"
How the dataset was obtained:
```python
import json

import datasets
# Read the Subtask 2 training file
dataset_path = "./SemEval-2024_Task3/training_data/Subtask_2_train.json"
with open(dataset_path) as f:
    dataset = json.load(f)
print(len(dataset))
# Flatten all conversations into a single list of utterances
all_conversations = []
for item in dataset:
all_conversations.extend(item["conversation"])
print(len(all_conversations))
# Build a Dataset and carve out an 8% held-out test split
all_data = datasets.Dataset.from_list(all_conversations)
all_data = all_data.train_test_split(
test_size=0.08,
seed=42,
)
# Publish the split dataset to the Hugging Face Hub
all_data.push_to_hub(
"dim/SemEval_training_data_emotions",
token=open("./hf_token").read(),
)
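
# Optional sanity check (our addition, not part of the original export script):
# reload the pushed dataset from the Hub to verify the upload.
reloaded = datasets.load_dataset("dim/SemEval_training_data_emotions")
print(reloaded)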
``` | dim/SemEval_training_data_emotions | [
"region:us"
] | 2024-01-17T14:36:07+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "utterance_ID", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "speaker", "dtype": "string"}, {"name": "emotion", "dtype": "string"}, {"name": "video_name", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1198989.1453851238, "num_examples": 12529}, {"name": "test", "num_bytes": 104309.85461487627, "num_examples": 1090}], "download_size": 614184, "dataset_size": 1303299.0}} | 2024-01-17T14:41:17+00:00 |
aea29053cfe67b10e19c0e639e538d0e77e9d5f3 | modelloosrvcc/datasetexemplo | [
"license:openrail",
"region:us"
] | 2024-01-17T14:36:40+00:00 | {"license": "openrail"} | 2024-01-17T14:40:49+00:00 |
|
cfdd09d098472e44bf09334ef47cd7e5e406ec3e |
# Dataset of aversa/インバース (Fire Emblem)
This is the dataset of aversa/インバース (Fire Emblem), containing 62 images and their tags.
The core tags of this character are `long_hair, breasts, white_hair, facial_mark, dark-skinned_female, dark_skin, large_breasts, red_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 62 | 75.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aversa_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 62 | 43.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aversa_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 131 | 83.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aversa_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 62 | 66.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aversa_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 131 | 120.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aversa_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/aversa_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 37 |  |  |  |  |  | 1girl, solo, cleavage, looking_at_viewer, thighhighs, smile, nail_polish, simple_background, bridal_gauntlets, navel, black_nails, book, dress, jewelry, long_fingernails, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | cleavage | looking_at_viewer | thighhighs | smile | nail_polish | simple_background | bridal_gauntlets | navel | black_nails | book | dress | jewelry | long_fingernails | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:--------------------|:-------------|:--------|:--------------|:--------------------|:-------------------|:--------|:--------------|:-------|:--------|:----------|:-------------------|:-------------------|
| 0 | 37 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/aversa_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T14:36:57+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T14:48:04+00:00 |
52c34f921753108dd3a21d8b1155972ed6e81bc5 |
# Dataset of nono (Fire Emblem)
This is the dataset of nono (Fire Emblem), containing 405 images and their tags.
The core tags of this character are `long_hair, pointy_ears, purple_eyes, green_hair, ahoge, ponytail, breasts, blonde_hair, small_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 405 | 446.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nono_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 405 | 265.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nono_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 891 | 531.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nono_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 405 | 398.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nono_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 891 | 727.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nono_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nono_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 61 |  |  |  |  |  | 1girl, cape, solo, gloves, smile, circlet, navel, open_mouth, midriff, looking_at_viewer, shorts, belt, garter_straps, boots, simple_background, flat_chest, pink_bow, pink_thighhighs |
| 1 | 25 |  |  |  |  |  | 1girl, solo, witch_hat, circlet, open_mouth, shorts, smile, halloween_costume, navel, midriff, sleeves_past_wrists, bow, looking_at_viewer, simple_background, belt, wide_sleeves, alternate_costume, broom_riding, boots, full_body |
| 2 | 9 |  |  |  |  |  | 1girl, blush, cum_in_pussy, hetero, navel, penis, solo_focus, 1boy, circlet, gloves, nipples, sex, vaginal, open_mouth, spread_legs, thighhighs, mosaic_censoring, smile, boots, cape, tears |
| 3 | 7 |  |  |  |  |  | 1boy, 1girl, blush, circlet, fellatio, hetero, penis, solo_focus, cum_in_mouth, heart, one_eye_closed, pov, loli, mosaic_censoring, simple_background, witch_hat |
| 4 | 5 |  |  |  |  |  | 1boy, 1girl, circlet, hetero, paizuri, penis, solo_focus, blush, ejaculation, nipples, gloves, open_mouth, uncensored, cum_on_breasts, heart, huge_breasts, jewelry, nude, simple_background, smile, tongue_out |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cape | solo | gloves | smile | circlet | navel | open_mouth | midriff | looking_at_viewer | shorts | belt | garter_straps | boots | simple_background | flat_chest | pink_bow | pink_thighhighs | witch_hat | halloween_costume | sleeves_past_wrists | bow | wide_sleeves | alternate_costume | broom_riding | full_body | blush | cum_in_pussy | hetero | penis | solo_focus | 1boy | nipples | sex | vaginal | spread_legs | thighhighs | mosaic_censoring | tears | fellatio | cum_in_mouth | heart | one_eye_closed | pov | loli | paizuri | ejaculation | uncensored | cum_on_breasts | huge_breasts | jewelry | nude | tongue_out |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------|:---------|:--------|:----------|:--------|:-------------|:----------|:--------------------|:---------|:-------|:----------------|:--------|:--------------------|:-------------|:-----------|:------------------|:------------|:--------------------|:----------------------|:------|:---------------|:--------------------|:---------------|:------------|:--------|:---------------|:---------|:--------|:-------------|:-------|:----------|:------|:----------|:--------------|:-------------|:-------------------|:--------|:-----------|:---------------|:--------|:-----------------|:------|:-------|:----------|:--------------|:-------------|:-----------------|:---------------|:----------|:-------|:-------------|
| 0 | 61 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 25 |  |  |  |  |  | X | | X | | X | X | X | X | X | X | X | X | | X | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | | X | X | X | X | X | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | | | | X | | | | | | | | | X | | | | X | | | | | | | | X | | X | X | X | X | | | | | | X | | X | X | X | X | X | X | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | X | X | X | | X | | | | | | | X | | | | | | | | | | | | X | | X | X | X | X | X | | | | | | | | | X | | | | X | X | X | X | X | X | X | X |
| CyberHarem/nono_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T14:37:05+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T15:55:43+00:00 |
3dc42886bdb38338e6cc03b5093c28b5b5694d0b | # Dataset Card for "Vietnamese-Book-Corpus"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tmnam20/Vietnamese-Book-Corpus | [
"region:us"
] | 2024-01-17T14:38:52+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3716689262, "num_examples": 16407}], "download_size": 1923451913, "dataset_size": 3716689262}} | 2024-01-17T14:44:27+00:00 |
4c65870d6d8e0576a555ad8ed26bc5ae02c86896 |
# Dataset of selena (Fire Emblem)
This is the dataset of selena (Fire Emblem), containing 358 images and their tags.
The core tags of this character are `long_hair, red_hair, twintails, red_eyes, breasts, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 358 | 374.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/selena_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 358 | 233.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/selena_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 779 | 449.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/selena_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 358 | 340.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/selena_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 779 | 598.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/selena_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/selena_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, armor, boots, fingerless_gloves, full_body, sword, simple_background, solo, bent_over, detached_sleeves, leaning_forward, looking_at_viewer, pants, sheathed, bare_shoulders, belt, white_background, bangs, closed_mouth |
| 1 | 9 |  |  |  |  |  | 1girl, nipples, blush, looking_at_viewer, navel, completely_nude, solo, large_breasts, collarbone, medium_breasts, pussy, bangs, hair_between_eyes, hair_ribbon |
| 2 | 8 |  |  |  |  |  | 1girl, barefoot, nipples, solo, completely_nude, full_body, medium_breasts, navel, standing, looking_at_viewer, outdoors, profile, running |
| 3 | 5 |  |  |  |  |  | 1girl, nipples, nude, solo, navel, small_breasts, medium_breasts, simple_background, uncensored, white_background, blush, brown_eyes, crossed_arms, pussy_juice, ribbon |
| 4 | 22 |  |  |  |  |  | 1girl, blush, hetero, 1boy, penis, solo_focus, sex, nude, vaginal, nipples, open_mouth, uncensored, navel, gloves, medium_breasts, cowgirl_position, cum_in_pussy, pov |
| 5 | 5 |  |  |  |  |  | 2girls, open_mouth, yuri, blush, tongue_out, nipples, 1boy, blue_hair, completely_nude, cunnilingus, heart-shaped_pupils, large_breasts, licking, saliva, sweat |
| 6 | 30 |  |  |  |  |  | fake_animal_ears, rabbit_ears, 1girl, pantyhose, playboy_bunny, solo, leotard, cleavage, looking_at_viewer, choker, blush, medium_breasts, alternate_costume, bare_shoulders, simple_background, white_gloves, easter_egg, hair_ornament |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | armor | boots | fingerless_gloves | full_body | sword | simple_background | solo | bent_over | detached_sleeves | leaning_forward | looking_at_viewer | pants | sheathed | bare_shoulders | belt | white_background | bangs | closed_mouth | nipples | blush | navel | completely_nude | large_breasts | collarbone | medium_breasts | pussy | hair_between_eyes | hair_ribbon | barefoot | standing | outdoors | profile | running | nude | small_breasts | uncensored | brown_eyes | crossed_arms | pussy_juice | ribbon | hetero | 1boy | penis | solo_focus | sex | vaginal | open_mouth | gloves | cowgirl_position | cum_in_pussy | pov | 2girls | yuri | tongue_out | blue_hair | cunnilingus | heart-shaped_pupils | licking | saliva | sweat | fake_animal_ears | rabbit_ears | pantyhose | playboy_bunny | leotard | cleavage | choker | alternate_costume | white_gloves | easter_egg | hair_ornament |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:--------------------|:------------|:--------|:--------------------|:-------|:------------|:-------------------|:------------------|:--------------------|:--------|:-----------|:-----------------|:-------|:-------------------|:--------|:---------------|:----------|:--------|:--------|:------------------|:----------------|:-------------|:-----------------|:--------|:--------------------|:--------------|:-----------|:-----------|:-----------|:----------|:----------|:-------|:----------------|:-------------|:-------------|:---------------|:--------------|:---------|:---------|:-------|:--------|:-------------|:------|:----------|:-------------|:---------|:-------------------|:---------------|:------|:---------|:-------|:-------------|:------------|:--------------|:----------------------|:----------|:---------|:--------|:-------------------|:--------------|:------------|:----------------|:----------|:-----------|:---------|:--------------------|:---------------|:-------------|:----------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | | | | | | | X | | | | X | | | | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | | | | X | | | X | | | | X | | | | | | | | X | | X | X | | | X | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | | | | X | X | | | | | | | | | X | | | X | X | X | | | | X | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 22 |  |  |  |  |  | X | | | | | | | | | | | | | | | | | | | X | X | X | | | | X | | | | | | | | | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | | | | | | | | | | | | | | | | | | | | X | X | | X | X | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 6 | 30 |  |  |  |  |  | X | | | | | | X | X | | | | X | | | X | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/selena_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T14:39:14+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T16:08:48+00:00 |
c3f2fc2b8d17bcac06155506c17744e609653a22 |
# Autocast
This is the Autocast dataset from the paper "[Forecasting Future World Events with Neural Networks](http://arxiv.org/abs/2206.15474)" by [Andy Zou](https://andyzoujm.github.io/), [Tristan Xiao](https://www.linkedin.com/in/tristan-xiao/), [Ryan Jia](https://www.linkedin.com/in/ryanjia/), [Joe Kwon](joekwon.io), [Mantas Mazeika](https://www.linkedin.com/in/mmazeika/), [Richard Li](https://www.linkedin.com/in/lirichard23/), [Dawn Song](https://people.eecs.berkeley.edu/~dawnsong/), [Jacob Steinhardt](https://www.stat.berkeley.edu/~jsteinhardt/), [Owain Evans](https://owainevans.github.io/), and [Dan Hendrycks](https://danhendrycks.com/).
The original dataset files are:
- `autocast_questions.json`
- `autocast_competition_test_set.json`
- `negated_tf_questions.json`
We have also processed the dataset to filter out problematic source links, removing:
- URLs that return non-200 HTTP status codes
- URLs from sites that are difficult to scrape, such as Twitter and Bloomberg
- links whose pages contain fewer than 1,000 words

Only samples with at least 5 working URLs are retained, and the number of working source links per sample is capped at 20. A minimal sketch of this filtering is shown below.
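The sketch below is illustrative only; the function and constant names are our own, and the actual processing pipeline may have differed:

```python
import requests

BLOCKED_DOMAINS = ("twitter.com", "bloomberg.com")  # hard-to-scrape sites
MIN_WORDS, MIN_LINKS, MAX_LINKS = 1000, 5, 20

def filter_source_links(question, scraped_texts):
    """Return the question's working source links, or None to drop the sample.

    `scraped_texts` maps each URL to its extracted article text.
    """
    working = []
    for url in question["source_links"]:
        # Skip domains that are hard to scrape
        if any(domain in url for domain in BLOCKED_DOMAINS):
            continue
        # Skip URLs that do not return HTTP 200
        try:
            response = requests.head(url, timeout=10, allow_redirects=True)
            if response.status_code != 200:
                continue
        except requests.RequestException:
            continue
        # Skip pages with fewer than 1,000 words of extracted text
        if len(scraped_texts.get(url, "").split()) < MIN_WORDS:
            continue
        working.append(url)
    # Keep the sample only if at least 5 links survive; cap at 20
    return working[:MAX_LINKS] if len(working) >= MIN_LINKS else None
```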
The refined dataset files are listed below; a loading sketch follows the list.
- `autocast_questions_filtered.json` - a JSON subset of the initial Autocast dataset.
- `autocast_questions_filtered.pkl` - a pickle file mapping URLs to the scraped data.
- `retrieved_docs.pkl` - a pickle file containing all texts that were retrieved.
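A minimal loading sketch for these files, assuming they sit in the working directory (variable names are ours):

```python
import json
import pickle

# Filtered subset of the question set
with open("autocast_questions_filtered.json") as f:
    questions = json.load(f)

# Mapping from URL to the scraped article data
with open("autocast_questions_filtered.pkl", "rb") as f:
    url_to_doc = pickle.load(f)

question = questions[0]
links = question["source_links"]
print(question["question"], "-", len(links), "source links")
print(sum(url in url_to_doc for url in links), "of them were scraped")
```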
<img align="center" src="assets/splash.png" width="750">
# Forecasting Future World Events with Neural Networks
## Introduction
Forecasting future world events is a challenging but valuable task. Forecasts of climate, geopolitical conflict, pandemics and economic indicators help shape policy and decision making. In these domains, the judgment of expert humans contributes to the best forecasts. Given advances in language modeling, can these forecasts be automated? To this end, we introduce Autocast, a dataset containing thousands of forecasting questions and an accompanying news corpus. Questions are taken from forecasting tournaments, ensuring high quality, real-world importance, and diversity. The news corpus is organized by date, allowing us to precisely simulate the conditions under which humans made past forecasts (avoiding leakage from the future). We test language models on our forecasting task and find that performance is far below a human expert baseline. However, performance improves with increased model size and incorporation of relevant information from the news corpus. In sum, Autocast poses a novel challenge for large language models and improved performance could bring large practical benefits.
## Autocast Dataset
The original [Autocast dataset can be downloaded here](https://people.eecs.berkeley.edu/~hendrycks/autocast.tar.gz). For more details on how to use the Autocast dataset and news articles, please refer to our short demonstration in `usage.ipynb`.
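A minimal download-and-extract sketch (the local file and directory names are assumptions):

```python
import tarfile
import urllib.request

URL = "https://people.eecs.berkeley.edu/~hendrycks/autocast.tar.gz"

# Download the archive and unpack it into ./autocast
urllib.request.urlretrieve(URL, "autocast.tar.gz")
with tarfile.open("autocast.tar.gz", "r:gz") as tar:
    tar.extractall("autocast")
```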
Each question has the following fields:
```json
{
"id": "unique identifier (str)",
"question": "question body (str)",
"background": "question context/details (str)",
"qtype": "question type (str)",
"status": "question status (str)",
"choices": "choices or possible ranges (List or Dict)",
"answer": "question resolution (str or float)",
"crowd": "human crowd forecasts over time (List)",
"publish_time": "publish timestamp (str)",
"close_time": "close timestamp (str)",
"prediction_count": "number of crowd predictions (int)",
"forecaster_count": "number of crowd forecasters (int)",
"tags": "question category (List)",
"source_links": "source links from comments (List)"
}
```
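As an illustration, a short sketch that reads the question file and inspects these fields (the path assumes the archive was extracted as in the sketch above):

```python
import json
from collections import Counter

with open("autocast/autocast_questions.json") as f:
    questions = json.load(f)

# Distribution of question types and statuses
print(Counter(q["qtype"] for q in questions))
print(Counter(q["status"] for q in questions))

q = questions[0]
print(q["question"])
print("open:", q["publish_time"], "->", q["close_time"])
print("crowd predictions:", q["prediction_count"], "answer:", q["answer"])
```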
The original authors obtained permission from [Metaculus](https://www.metaculus.com/) to host the dataset on GitHub for research purposes only.
## IntervalQA Dataset
Motivated by the difficulty of forecasting numbers across orders of magnitude (e.g. global cases of COVID-19 in 2022), the original authors also curated IntervalQA, a dataset of numerical questions and metrics for calibration.
[Download the IntervalQA dataset here](https://people.eecs.berkeley.edu/~hendrycks/intervalqa.tar.gz).
## Citation
If you find this useful in your research, please consider citing the original authors:

    @article{zouforecasting2022,
      title={Forecasting Future World Events with Neural Networks},
      author={Andy Zou and Tristan Xiao and Ryan Jia and Joe Kwon and Mantas Mazeika and Richard Li and Dawn Song and Jacob Steinhardt and Owain Evans and Dan Hendrycks},
      journal={NeurIPS},
      year={2022}
    }
| valory/autocast | [
"arxiv:2206.15474",
"region:us"
] | 2024-01-17T14:39:53+00:00 | {} | 2024-02-05T20:39:07+00:00 |
b89b5aa89f8c9a69ee541b28daf6d43424c15599 | biki96/hf-stack-v1 | [
"region:us"
] | 2024-01-17T14:45:28+00:00 | {"dataset_info": {"features": [{"name": "repo_id", "dtype": "string"}, {"name": "file_path", "dtype": "string"}, {"name": "content", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 117582122, "num_examples": 7362}], "download_size": 39848159, "dataset_size": 117582122}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-17T14:47:41+00:00 |
|
4d8c1a5952026eda300fc702d5c52891251296f4 | DiogoAvalos/claudioduarte2 | [
"license:openrail",
"region:us"
] | 2024-01-17T14:47:23+00:00 | {"license": "openrail"} | 2024-01-17T14:47:24+00:00 |
|
78a77a8650d8ed1412c6bb7255bfb7e36dba7fab |
# Dataset of cellica (Fire Emblem)
This is the dataset of cellica (Fire Emblem), containing 500 images and their tags.
The core tags of this character are `long_hair, red_hair, red_eyes, breasts, earrings, bangs, hairband`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 571.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cellica_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 359.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cellica_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1080 | 696.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cellica_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 518.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cellica_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1080 | 918.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cellica_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/cellica_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, cape, fingerless_gloves, jewelry, looking_at_viewer, simple_background, smile, solo, tiara, armor, bare_shoulders, detached_collar, black_gloves, white_dress |
| 1 | 7 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, simple_background, solo, tiara, upper_body, armor, detached_collar, jewelry, smile, white_dress, closed_mouth, cape, cleavage, medium_breasts, white_background |
| 2 | 14 |  |  |  |  |  | 1girl, cape, dress, jewelry, solo, armor, fingerless_gloves, holding_sword, simple_background, smile, tiara, black_thighhighs, looking_at_viewer, zettai_ryouiki, detached_collar, white_background, bare_shoulders, boots, full_body, black_gloves, cowboy_shot |
| 3 | 5 |  |  |  |  |  | 1girl, navel, nipples, smile, solo, completely_nude, large_breasts, looking_at_viewer, medium_breasts, pussy, collarbone, outdoors, standing, thighs, water, blush, day, jewelry, nature, wading, yellow_eyes |
| 4 | 8 |  |  |  |  |  | 1girl, jewelry, solo_focus, thighhighs, hetero, open_mouth, tiara, 1boy, blush, breasts_out, clothed_sex, cowgirl_position, cum_in_pussy, girl_on_top, nipples, penis, vaginal, armor, black_gloves, fingerless_gloves, large_breasts, spread_legs, cape, detached_collar, medium_breasts, sweat, bar_censor, dress_lift, dress_pull, looking_at_viewer, white_dress |
| 5 | 7 |  |  |  |  |  | 1boy, 1girl, hetero, penis, solo_focus, bar_censor, fellatio, nipples, jewelry, nude, blush, large_breasts, medium_breasts, testicles |
| 6 | 12 |  |  |  |  |  | 1boy, 1girl, hetero, nipples, blush, solo_focus, sex, large_breasts, open_mouth, penis, vaginal, cum_in_pussy, navel, bar_censor, completely_nude, tiara, jewelry, sweat |
| 7 | 5 |  |  |  |  |  | 1girl, barefoot, large_breasts, nipples, solo, arms_behind_back, blush, bondage, feet, looking_at_viewer, navel, rope, toes, completely_nude, pussy_juice, restrained, shibari, smile, spread_legs, sweat, thighs, clitoris, closed_mouth, jewelry, mosaic_censoring, squatting, uncensored |
| 8 | 7 |  |  |  |  |  | 1girl, blush, hetero, nipples, thighhighs, mmf_threesome, multiple_penises, anal, dark-skinned_male, nude, open_mouth, vaginal, 2boys, ass, blunt_bangs, double_penetration, interracial, jewelry, large_breasts, tongue_out, ahegao, cum, faceless_male, gloves, medium_breasts, mosaic_censoring, pussy, solo_focus, sweat, tiara |
| 9 | 5 |  |  |  |  |  | 1girl, navel, smile, beach, cleavage, cloud, hair_flower, looking_at_viewer, solo, white_bikini, alternate_costume, blue_sky, day, jewelry, medium_breasts, ocean, open_mouth, outdoors, collarbone, sitting, thighs, water |
| 10 | 6 |  |  |  |  |  | 1girl, bondage, gagged, rope, arms_behind_back, blush, solo, improvised_gag, jewelry, black_thighhighs, cleavage, large_breasts, medium_breasts, navel, orange_hair, panties, shibari |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cape | fingerless_gloves | jewelry | looking_at_viewer | simple_background | smile | solo | tiara | armor | bare_shoulders | detached_collar | black_gloves | white_dress | upper_body | closed_mouth | cleavage | medium_breasts | white_background | dress | holding_sword | black_thighhighs | zettai_ryouiki | boots | full_body | cowboy_shot | navel | nipples | completely_nude | large_breasts | pussy | collarbone | outdoors | standing | thighs | water | blush | day | nature | wading | yellow_eyes | solo_focus | thighhighs | hetero | open_mouth | 1boy | breasts_out | clothed_sex | cowgirl_position | cum_in_pussy | girl_on_top | penis | vaginal | spread_legs | sweat | bar_censor | dress_lift | dress_pull | fellatio | nude | testicles | sex | barefoot | arms_behind_back | bondage | feet | rope | toes | pussy_juice | restrained | shibari | clitoris | mosaic_censoring | squatting | uncensored | mmf_threesome | multiple_penises | anal | dark-skinned_male | 2boys | ass | blunt_bangs | double_penetration | interracial | tongue_out | ahegao | cum | faceless_male | gloves | beach | cloud | hair_flower | white_bikini | alternate_costume | blue_sky | ocean | sitting | gagged | improvised_gag | orange_hair | panties |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-------|:--------------------|:----------|:--------------------|:--------------------|:--------|:-------|:--------|:--------|:-----------------|:------------------|:---------------|:--------------|:-------------|:---------------|:-----------|:-----------------|:-------------------|:--------|:----------------|:-------------------|:-----------------|:--------|:------------|:--------------|:--------|:----------|:------------------|:----------------|:--------|:-------------|:-----------|:-----------|:---------|:--------|:--------|:------|:---------|:---------|:--------------|:-------------|:-------------|:---------|:-------------|:-------|:--------------|:--------------|:-------------------|:---------------|:--------------|:--------|:----------|:--------------|:--------|:-------------|:-------------|:-------------|:-----------|:-------|:------------|:------|:-----------|:-------------------|:----------|:-------|:-------|:-------|:--------------|:-------------|:----------|:-----------|:-------------------|:------------|:-------------|:----------------|:-------------------|:-------|:--------------------|:--------|:------|:--------------|:---------------------|:--------------|:-------------|:---------|:------|:----------------|:---------|:--------|:--------|:--------------|:---------------|:--------------------|:-----------|:--------|:----------|:---------|:-----------------|:--------------|:----------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | X | X | | X | X | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | X | X | X | X | | | | X | X | | X | X | X | | | | X | | | | | | | | | | X | | X | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | | | X | | | | | | | | | | | | | | X | | | | | | | | | | X | | X | | | | | | | X | | | | | X | | X | | X | | | | | | X | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 12 |  |  |  |  |  | X | | | X | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | X | | | | | X | | X | X | X | | | | X | | X | X | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | | X | X | | X | X | | | | | | | | X | | | | | | | | | | | X | X | X | X | | | | | X | | X | | | | | | | | | | | | | | | | | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 7 |  |  |  |  |  | X | | | X | | | | | X | | | | | | | | | X | | | | | | | | | | X | | X | X | | | | | | X | | | | | X | X | X | X | | | | | | | | X | | X | | | | | X | | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | | | X | X | | X | X | | | | | | | | | X | X | | | | | | | | | X | | | | | X | X | | X | X | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | |
| 10 | 6 |  |  |  |  |  | X | | | X | | | | X | | | | | | | | | X | X | | | | X | | | | | X | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X |
| CyberHarem/cellica_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T14:48:30+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T16:36:58+00:00 |
bd72953c0d3a2aafff3a3945bdcf9a5b58d5b141 |
# Dataset of elice/エリス (Fire Emblem)
This is the dataset of elice/エリス (Fire Emblem), containing 500 images and their tags.
The core tags of this character are `blue_hair, blue_eyes, long_hair, hair_between_eyes, breasts, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 652.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elice_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 370.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elice_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1132 | 754.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elice_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 574.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elice_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1132 | 1.05 GiB | [Download](https://huggingface.co/datasets/CyberHarem/elice_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/elice_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, cape, falchion_(fire_emblem), fingerless_gloves, solo, sword, tiara, smile, armor, blush, looking_at_viewer |
| 1 | 12 |  |  |  |  |  | 1girl, cape, falchion_(fire_emblem), solo, tiara, armor, fingerless_gloves, holding_sword, looking_at_viewer, simple_background, belt |
| 2 | 5 |  |  |  |  |  | 1girl, blue_gloves, falchion_(fire_emblem), fingerless_gloves, holding_sword, long_sleeves, looking_at_viewer, solo, tiara, white_background, blue_cape, closed_mouth, red_cape, simple_background, blue_footwear, brown_belt, shoulder_armor, sweater, thigh_boots |
| 3 | 7 |  |  |  |  |  | 1girl, bare_shoulders, hair_flower, looking_at_viewer, official_alternate_costume, solo, white_dress, sleeveless_dress, smile, blush, cleavage, closed_mouth, collarbone, symbol-shaped_pupils, armlet, small_breasts, upper_body, white_flower |
| 4 | 5 |  |  |  |  |  | 1girl, crop_top, looking_at_viewer, midriff, navel, short_shorts, solo, tiara, bare_shoulders, blush, official_alternate_costume, small_breasts, open_mouth, simple_background, sleeveless, thighs, white_background, :d, arm_up, armpits, belt, bikini, blue_shorts, innertube |
| 5 | 7 |  |  |  |  |  | 1girl, day, navel, smile, solo, tiara, blue_bikini, crop_top, looking_at_viewer, midriff, outdoors, armpits, bare_shoulders, cloud, ocean, short_shorts, small_breasts, water, alternate_costume, arm_up, beach, belt, blue_sky, blush, closed_mouth, cowboy_shot, innertube, sleeveless, thighs, tree, wet |
| 6 | 7 |  |  |  |  |  | 1girl, large_breasts, smile, solo, alternate_breast_size, looking_at_viewer, navel, tiara, blue_bikini, patreon_username |
| 7 | 19 |  |  |  |  |  | 1girl, looking_at_viewer, solo, alternate_costume, orange_shorts, short_shorts, tiara, waitress, beer_mug, smile, cleavage, employee_uniform, blush, medium_breasts, open_mouth, white_tank_top, holding_plate, chicken_(food), holding_cup, navel, tray |
| 8 | 12 |  |  |  |  |  | 1girl, official_alternate_costume, playboy_bunny, rabbit_ears, smile, solo, fake_animal_ears, looking_at_viewer, white_pantyhose, leotard, rabbit_tail, simple_background, cleavage, open_mouth, blush, small_breasts, easter_egg, frilled_choker, puffy_short_sleeves, white_background, white_gloves |
| 9 | 8 |  |  |  |  |  | 1girl, tiara, uncensored, blush, hetero, nipples, penis, pussy, 1boy, looking_at_viewer, navel, sex, solo_focus, spread_legs, large_breasts, open_mouth, vaginal, completely_nude, cum, lying, alternate_breast_size, clitoris, pov, sweat, tongue_out |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cape | falchion_(fire_emblem) | fingerless_gloves | solo | sword | tiara | smile | armor | blush | looking_at_viewer | holding_sword | simple_background | belt | blue_gloves | long_sleeves | white_background | blue_cape | closed_mouth | red_cape | blue_footwear | brown_belt | shoulder_armor | sweater | thigh_boots | bare_shoulders | hair_flower | official_alternate_costume | white_dress | sleeveless_dress | cleavage | collarbone | symbol-shaped_pupils | armlet | small_breasts | upper_body | white_flower | crop_top | midriff | navel | short_shorts | open_mouth | sleeveless | thighs | :d | arm_up | armpits | bikini | blue_shorts | innertube | day | blue_bikini | outdoors | cloud | ocean | water | alternate_costume | beach | blue_sky | cowboy_shot | tree | wet | large_breasts | alternate_breast_size | patreon_username | orange_shorts | waitress | beer_mug | employee_uniform | medium_breasts | white_tank_top | holding_plate | chicken_(food) | holding_cup | tray | playboy_bunny | rabbit_ears | fake_animal_ears | white_pantyhose | leotard | rabbit_tail | easter_egg | frilled_choker | puffy_short_sleeves | white_gloves | uncensored | hetero | nipples | penis | pussy | 1boy | sex | solo_focus | spread_legs | vaginal | completely_nude | cum | lying | clitoris | pov | sweat | tongue_out |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------------------------|:--------------------|:-------|:--------|:--------|:--------|:--------|:--------|:--------------------|:----------------|:--------------------|:-------|:--------------|:---------------|:-------------------|:------------|:---------------|:-----------|:----------------|:-------------|:-----------------|:----------|:--------------|:-----------------|:--------------|:-----------------------------|:--------------|:-------------------|:-----------|:-------------|:-----------------------|:---------|:----------------|:-------------|:---------------|:-----------|:----------|:--------|:---------------|:-------------|:-------------|:---------|:-----|:---------|:----------|:---------|:--------------|:------------|:------|:--------------|:-----------|:--------|:--------|:--------|:--------------------|:--------|:-----------|:--------------|:-------|:------|:----------------|:------------------------|:-------------------|:----------------|:-----------|:-----------|:-------------------|:-----------------|:-----------------|:----------------|:-----------------|:--------------|:-------|:----------------|:--------------|:-------------------|:------------------|:----------|:--------------|:-------------|:-----------------|:----------------------|:---------------|:-------------|:---------|:----------|:--------|:--------|:-------|:------|:-------------|:--------------|:----------|:------------------|:------|:--------|:-----------|:------|:--------|:-------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | X | X | X | X | | X | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | X | X | | X | | | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | | | X | | | X | | X | X | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | | X | | X | | | X | X | | X | X | | | X | | | | | | | | | X | | X | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | | | | X | | X | X | | X | X | | | X | | | | | X | | | | | | | X | | | | | | | | | X | | | X | X | X | X | | X | X | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | | | | X | | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 19 |  |  |  |  |  | X | | | | X | | X | X | | X | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 12 |  |  |  |  |  | X | | | | X | | | X | | X | X | | X | | | | X | | | | | | | | | | | X | | | X | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 9 | 8 |  |  |  |  |  | X | | | | | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/elice_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T14:49:18+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T16:40:48+00:00 |
73bc6119696b545fe2f9e802d5d5e40e12610e40 |
# Dataset of rea (Fire Emblem)
This is the dataset of rea (Fire Emblem), containing 500 images and their tags.
The core tags of this character are `long_hair, green_hair, green_eyes, breasts, hair_ornament, large_breasts, hair_flower, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 680.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rea_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 392.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rea_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1143 | 788.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rea_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 600.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rea_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1143 | 1.06 GiB | [Download](https://huggingface.co/datasets/CyberHarem/rea_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/rea_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
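As a quick follow-up, a minimal sketch (not part of the original pipeline) that filters the loaded items by tag; it assumes `source` from the snippet above and that `item.meta['tags']` maps tag names to scores:
```python
# Minimal sketch: keep only single-character images via the 'solo' tag.
# Assumption: item.meta['tags'] is a mapping whose keys are tag names.
solo_items = []
for item in source:
    if 'solo' in item.meta['tags']:
        solo_items.append(item)
print(f'{len(solo_items)} solo images found')
```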
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, closed_mouth, long_sleeves, simple_background, solo, flower, white_dress, bare_shoulders, smile |
| 1 | 10 |  |  |  |  |  | 1girl, flower, solo, tiara, crown, closed_mouth, simple_background, smile, upper_body, white_background, portrait |
| 2 | 9 |  |  |  |  |  | 1girl, hair_ribbon, pointy_ears, ribbon_braid, side_braid, solo, tiara, twin_braids, closed_mouth, smile, simple_background, looking_at_viewer, upper_body |
| 3 | 8 |  |  |  |  |  | 1girl, barefoot, hair_ribbon, pointy_ears, ribbon_braid, solo, tiara, twin_braids, very_long_hair, blue_dress, anklet, floating_hair, full_body, armpits, side_braid, smile, sparkle, open_mouth |
| 4 | 6 |  |  |  |  |  | 1girl, fur_trim, gift_box, hair_ribbon, pointy_ears, ribbon_braid, solo, tiara, twin_braids, christmas_ornaments, smile, dress, holding, open_mouth, sack, side_braid |
| 5 | 11 |  |  |  |  |  | 1girl, cleavage, closed_mouth, flower, smile, solo, white_bikini, looking_at_viewer, navel, simple_background, white_background |
| 6 | 5 |  |  |  |  |  | 1girl, blue_sky, cleavage, closed_mouth, day, flower, navel, outdoors, white_bikini, official_alternate_costume, smile, beach, solo_focus, water, 1boy, cloud, holding_hands, ocean |
| 7 | 5 |  |  |  |  |  | 1girl, bare_shoulders, beach, blue_sky, blush, cleavage, closed_mouth, collarbone, day, looking_at_viewer, navel, ocean, outdoors, parted_bangs, solo, stomach, thighs, alternate_costume, cowboy_shot, sunlight, black_bikini, cloud, earrings, forehead, sand, thigh_gap, skindentation, smile, umbrella, very_long_hair, water |
| 8 | 6 |  |  |  |  |  | 2girls, cleavage, closed_mouth, flower, navel, thighs, white_bikini, holding, jewelry, legs, sandals, simple_background, smile, full_body, looking_at_viewer, solo_focus, toes, bare_shoulders, circlet, grey_background, official_alternate_costume |
| 9 | 5 |  |  |  |  |  | 1girl, circlet, collarbone, looking_at_viewer, navel, parted_bangs, solo, thighs, white_panties, white_shirt, blush, crop_top, flower, smile, cleavage, parted_lips, bare_shoulders, legs, long_sleeves, lying, off-shoulder_shirt, on_bed, tassel |
| 10 | 12 |  |  |  |  |  | 1girl, bare_shoulders, solo, cleavage, flower, parted_bangs, white_dress, blush, collarbone, looking_at_viewer, smile, circlet, thighs, sitting |
| 11 | 5 |  |  |  |  |  | 1girl, bare_shoulders, blush, crop_top, long_sleeves, midriff, solo, circlet, closed_mouth, collarbone, flower, green_pants, high-waist_pants, looking_at_viewer, navel, parted_bangs, thighs, tight_pants, white_shirt, alternate_costume, cleavage, contemporary, off-shoulder_shirt, smile, yoga_pants, dated, hand_on_hip, simple_background, tassel |
| 12 | 22 |  |  |  |  |  | witch_hat, 1girl, solo, halloween_costume, official_alternate_costume, looking_at_viewer, smile, very_long_hair, long_sleeves, blue_dress, simple_background, wide_sleeves, collarbone, holding, blush, closed_mouth, hat_flower, long_dress |
| 13 | 10 |  |  |  |  |  | blush, completely_nude, 1girl, nipples, open_mouth, penis, uncensored, hetero, pussy, sex, vaginal, 1boy, solo_focus, pointy_ears, sweat, anus, artist_name, ass, english_text, flower, navel, spread_legs |
| 14 | 10 |  |  |  |  |  | 1girl, hetero, blush, flower, solo_focus, fellatio, mosaic_censoring, 1boy, cum, looking_at_viewer, nipples, pubic_hair, gangbang, handjob, huge_breasts, multiple_boys, multiple_penises |
| 15 | 5 |  |  |  |  |  | 1boy, 1girl, bar_censor, blush, flower, hetero, penis, alternate_hair_color, breast_sucking, gloved_handjob, huge_breasts, nursing_handjob, smile, tiara, ejaculation, nipples, short_hair, breastfeeding, closed_eyes, crown, grabbing, lactation, open_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | closed_mouth | long_sleeves | simple_background | solo | flower | white_dress | bare_shoulders | smile | tiara | crown | upper_body | white_background | portrait | hair_ribbon | pointy_ears | ribbon_braid | side_braid | twin_braids | looking_at_viewer | barefoot | very_long_hair | blue_dress | anklet | floating_hair | full_body | armpits | sparkle | open_mouth | fur_trim | gift_box | christmas_ornaments | dress | holding | sack | cleavage | white_bikini | navel | blue_sky | day | outdoors | official_alternate_costume | beach | solo_focus | water | 1boy | cloud | holding_hands | ocean | blush | collarbone | parted_bangs | stomach | thighs | alternate_costume | cowboy_shot | sunlight | black_bikini | earrings | forehead | sand | thigh_gap | skindentation | umbrella | 2girls | jewelry | legs | sandals | toes | circlet | grey_background | white_panties | white_shirt | crop_top | parted_lips | lying | off-shoulder_shirt | on_bed | tassel | sitting | midriff | green_pants | high-waist_pants | tight_pants | contemporary | yoga_pants | dated | hand_on_hip | witch_hat | halloween_costume | wide_sleeves | hat_flower | long_dress | completely_nude | nipples | penis | uncensored | hetero | pussy | sex | vaginal | sweat | anus | artist_name | ass | english_text | spread_legs | fellatio | mosaic_censoring | cum | pubic_hair | gangbang | handjob | huge_breasts | multiple_boys | multiple_penises | bar_censor | alternate_hair_color | breast_sucking | gloved_handjob | nursing_handjob | ejaculation | short_hair | breastfeeding | closed_eyes | grabbing | lactation |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:---------------|:---------------|:--------------------|:-------|:---------|:--------------|:-----------------|:--------|:--------|:--------|:-------------|:-------------------|:-----------|:--------------|:--------------|:---------------|:-------------|:--------------|:--------------------|:-----------|:-----------------|:-------------|:---------|:----------------|:------------|:----------|:----------|:-------------|:-----------|:-----------|:----------------------|:--------|:----------|:-------|:-----------|:---------------|:--------|:-----------|:------|:-----------|:-----------------------------|:--------|:-------------|:--------|:-------|:--------|:----------------|:--------|:--------|:-------------|:---------------|:----------|:---------|:--------------------|:--------------|:-----------|:---------------|:-----------|:-----------|:-------|:------------|:----------------|:-----------|:---------|:----------|:-------|:----------|:-------|:----------|:------------------|:----------------|:--------------|:-----------|:--------------|:--------|:---------------------|:---------|:---------|:----------|:----------|:--------------|:-------------------|:--------------|:---------------|:-------------|:--------|:--------------|:------------|:--------------------|:---------------|:-------------|:-------------|:------------------|:----------|:--------|:-------------|:---------|:--------|:------|:----------|:--------|:-------|:--------------|:------|:---------------|:--------------|:-----------|:-------------------|:------|:-------------|:-----------|:----------|:---------------|:----------------|:-------------------|:-------------|:-----------------------|:-----------------|:-----------------|:------------------|:--------------|:-------------|:----------------|:--------------|:-----------|:------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | | X | X | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | | X | X | | | | X | X | | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | | | | X | | | | X | X | | | | | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | | | X | | | | X | X | | | | | X | X | X | X | X | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 11 |  |  |  |  |  | X | X | | X | X | X | | | X | | | | X | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | | | X | | | X | X | | | | | | | | | | | X | | X | | | | | | | | | | | | | | X | | X | X | X | X | | X | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | | X | | X | | X | | X | X | | | | | | | | | | | X | | | | | | X | | | | | | | | X | | X | X | X | | | | X | | X | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | | X | | X | X | | X | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | X | X | X | | X | | | | | | | | | | | | | X | | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 12 |  |  |  |  |  | X | | | | X | X | X | X | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 5 |  |  |  |  |  | X | X | X | X | X | X | | X | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | X | X | X | | X | X | | | | | | | | | | | | | | | X | | | X | X | | | X | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 12 | 22 |  |  |  |  |  | X | X | X | X | X | | | | X | | | | | | | | | | | X | | X | X | | | | | | | | | | | X | | | | | | | | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 13 | 10 |  |  |  |  |  | X | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 14 | 10 |  |  |  |  |  | X | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 15 | 5 |  |  |  |  |  | X | | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | X | | | | | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/rea_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T14:49:27+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T16:41:44+00:00 |
b889316a84c1b1aebf7b341b482663dad3e01eee | DiogoAvalos/claudioduarte3 | [
"license:openrail",
"region:us"
] | 2024-01-17T14:52:07+00:00 | {"license": "openrail"} | 2024-01-17T14:53:40+00:00 |
|
31e9070c6ec3d8f295659a01f003f2a54fcb3cf2 | # Dataset Card for "cult-dpo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mii-llm/cult-dpo | [
"region:us"
] | 2024-01-17T14:59:01+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}, {"name": "system", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2153488, "num_examples": 987}], "download_size": 1208959, "dataset_size": 2153488}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-17T14:59:06+00:00 |
b27d8e99aa573dae1a9f656ed85c4989a0078114 |
# Pseudostreaming Malaya-Speech STT
Original dataset at https://github.com/mesolitica/malaysian-dataset/tree/master/speech-to-text-semisupervised/pseudolabel-malaya-speech-stt
We use https://huggingface.co/mesolitica/conformer-medium-mixed to generate the pseudostreaming dataset; source code is at https://github.com/mesolitica/malaysian-dataset/tree/master/speech-to-text-semisupervised/pseudostreaming-malaya-speech-stt
Total duration: 8667.802379812754 hours.
Data format, taken from [processed.jsonl](processed.jsonl):
```json
[
{
"text": "pernahkah",
"start": 0.2802439024390244,
"end": 0.9005226480836237,
"audio_filename": "processed-audio/0-321061-0.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda",
"start": 0.2802439024390244,
"end": 1.1407317073170733,
"audio_filename": "processed-audio/0-321061-1.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda menga",
"start": 0.2802439024390244,
"end": 1.5410801393728224,
"audio_filename": "processed-audio/0-321061-2.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengala",
"start": 0.2802439024390244,
"end": 1.741254355400697,
"audio_filename": "processed-audio/0-321061-3.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengalami",
"start": 0.2802439024390244,
"end": 1.8613588850174216,
"audio_filename": "processed-audio/0-321061-4.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengalami situ",
"start": 0.2802439024390244,
"end": 2.061533101045296,
"audio_filename": "processed-audio/0-321061-5.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengalami situa",
"start": 0.2802439024390244,
"end": 2.3017421602787453,
"audio_filename": "processed-audio/0-321061-6.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengalami situasi",
"start": 0.2802439024390244,
"end": 2.3818118466898954,
"audio_filename": "processed-audio/0-321061-7.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengalami situasi di",
"start": 0.2802439024390244,
"end": 2.541951219512195,
"audio_filename": "processed-audio/0-321061-8.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengalami situasi di mana",
"start": 0.2802439024390244,
"end": 2.702090592334495,
"audio_filename": "processed-audio/0-321061-9.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengalami situasi di mana su",
"start": 0.2802439024390244,
"end": 2.9823344947735193,
"audio_filename": "processed-audio/0-321061-10.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengalami situasi di mana sub",
"start": 0.2802439024390244,
"end": 3.102439024390244,
"audio_filename": "processed-audio/0-321061-11.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengalami situasi di mana subj",
"start": 0.2802439024390244,
"end": 3.182508710801394,
"audio_filename": "processed-audio/0-321061-12.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengalami situasi di mana subje",
"start": 0.2802439024390244,
"end": 3.3026132404181183,
"audio_filename": "processed-audio/0-321061-13.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengalami situasi di mana subjek",
"start": 0.2802439024390244,
"end": 3.3426480836236934,
"audio_filename": "processed-audio/0-321061-14.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengalami situasi di mana subjek ter",
"start": 0.2802439024390244,
"end": 3.462752613240418,
"audio_filename": "processed-audio/0-321061-15.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengalami situasi di mana subjek terke",
"start": 0.2802439024390244,
"end": 3.622891986062718,
"audio_filename": "processed-audio/0-321061-16.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengalami situasi di mana subjek terkelu",
"start": 0.2802439024390244,
"end": 3.7830313588850175,
"audio_filename": "processed-audio/0-321061-17.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengalami situasi di mana subjek terkeluar",
"start": 0.2802439024390244,
"end": 3.863101045296167,
"audio_filename": "processed-audio/0-321061-18.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengalami situasi di mana subjek terkeluar daripada",
"start": 0.2802439024390244,
"end": 3.9832055749128923,
"audio_filename": "processed-audio/0-321061-19.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengalami situasi di mana subjek terkeluar daripada bi",
"start": 0.2802439024390244,
"end": 4.463623693379791,
"audio_filename": "processed-audio/0-321061-20.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengalami situasi di mana subjek terkeluar daripada bing",
"start": 0.2802439024390244,
"end": 4.62376306620209,
"audio_filename": "processed-audio/0-321061-21.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengalami situasi di mana subjek terkeluar daripada bingka",
"start": 0.2802439024390244,
"end": 4.663797909407666,
"audio_filename": "processed-audio/0-321061-22.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengalami situasi di mana subjek terkeluar daripada bingkai",
"start": 0.2802439024390244,
"end": 4.7438675958188155,
"audio_filename": "processed-audio/0-321061-23.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengalami situasi di mana subjek terkeluar daripada bingkai gambar",
"start": 0.2802439024390244,
"end": 4.863972125435541,
"audio_filename": "processed-audio/0-321061-24.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
},
{
"text": "pernahkah anda mengalami situasi di mana subjek terkeluar daripada bingkai gambar",
"start": 0.2802439024390244,
"end": 4.863972125435541,
"audio_filename": "processed-audio/0-321061-25.mp3",
"original_audio_filename": "output-audio/0-10033-5.mp3"
}
]
```
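A minimal sketch (assumption: each line of `processed.jsonl` is one such JSON list, matching the example above) for grouping the incremental transcripts by their source audio:
```python
# Minimal sketch: group pseudostreaming segments by original audio file.
# Assumes one JSON list per line of processed.jsonl; adjust if the layout differs.
import json
from collections import defaultdict

groups = defaultdict(list)
with open('processed.jsonl') as f:
    for line in f:
        for segment in json.loads(line):
            groups[segment['original_audio_filename']].append(segment)

for original, segments in groups.items():
    # the last partial transcript covers the full utterance
    print(original, '->', segments[-1]['text'])
    break
```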
## How-to
```bash
git clone https://huggingface.co/datasets/mesolitica/pseudostreaming-malaya-speech-stt
cd pseudostreaming-malaya-speech-stt
wget https://www.7-zip.org/a/7z2301-linux-x64.tar.xz
tar -xf 7z2301-linux-x64.tar.xz
./7zz x processed-audio.7z.001 -y -mmt40
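# Notes (assumptions, not from the original): given the first volume, 7zz picks up
# the remaining .7z.00N parts of the multipart archive automatically; -y answers
# yes to all prompts and -mmt40 requests 40 threads, so tune it to your CPU.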
``` | mesolitica/pseudostreaming-malaya-speech-stt | [
"task_categories:automatic-speech-recognition",
"language:ms",
"license:mit",
"region:us"
] | 2024-01-17T14:59:47+00:00 | {"language": ["ms"], "license": "mit", "task_categories": ["automatic-speech-recognition"]} | 2024-02-12T07:57:36+00:00 |
0a23b0cc45605f6751ef88e0e182a570ddecfc7c | CultriX/MsitralTrix-test-dpo | [
"task_categories:question-answering",
"size_categories:n<1K",
"language:en",
"license:apache-2.0",
"chemistry",
"biology",
"dpo",
"medical",
"region:us"
] | 2024-01-17T15:00:22+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["n<1K"], "task_categories": ["question-answering"], "pretty_name": "MistralTrix-test-dpo", "tags": ["chemistry", "biology", "dpo", "medical"]} | 2024-01-17T15:00:54+00:00 |
|
25d6c5ac149aa698bf38f7f862e8d085eb18d4a9 | kenken6696/FOLIO_by_LET | [
"license:cc-by-4.0",
"region:us"
] | 2024-01-17T15:00:44+00:00 | {"license": "cc-by-4.0", "dataset_info": {"features": [{"name": "example_id", "dtype": "int64"}, {"name": "conclusion", "dtype": "string"}, {"name": "premises", "sequence": "string"}, {"name": "label", "dtype": "string"}, {"name": "LET_count", "dtype": "int64"}, {"name": "LEC_types", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 2337894, "num_examples": 4651}], "download_size": 139624, "dataset_size": 2337894}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-17T15:00:59+00:00 |
|
25f3fb413936f65622851b69b1a128fa5f7832d8 |
# Dataset of anna (Fire Emblem)
This is the dataset of anna (Fire Emblem), containing 353 images and their tags.
The core tags of this character are `red_hair, ponytail, breasts, red_eyes, long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 353 | 399.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anna_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 353 | 225.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anna_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 841 | 474.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anna_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 353 | 351.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anna_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 841 | 663.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/anna_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/anna_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, solo, looking_at_viewer, smile, simple_background, white_background, blush, cape, gloves, one_eye_closed, open_mouth, upper_body |
| 1 | 15 |  |  |  |  |  | 1boy, 1girl, hetero, nipples, penis, blush, solo_focus, smile, open_mouth, cowgirl_position, cum_on_body, girl_on_top, mosaic_censoring, navel, sex, vaginal, completely_nude, pov, uncensored |
| 2 | 9 |  |  |  |  |  | 1girl, nipples, solo, uncensored, completely_nude, erection, huge_penis, large_penis, blush, large_breasts, navel, futanari_masturbation, open_mouth, veins, artist_name, ejaculation, large_testicles |
| 3 | 10 |  |  |  |  |  | 1girl, bare_shoulders, hair_flower, white_dress, smile, solo, looking_at_viewer, simple_background, bangs, detached_sleeves, bride, full_body, holding, jewelry, official_alternate_costume, wedding_dress, choker, rose, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | smile | simple_background | white_background | blush | cape | gloves | one_eye_closed | open_mouth | upper_body | 1boy | hetero | nipples | penis | solo_focus | cowgirl_position | cum_on_body | girl_on_top | mosaic_censoring | navel | sex | vaginal | completely_nude | pov | uncensored | erection | huge_penis | large_penis | large_breasts | futanari_masturbation | veins | artist_name | ejaculation | large_testicles | bare_shoulders | hair_flower | white_dress | bangs | detached_sleeves | bride | full_body | holding | jewelry | official_alternate_costume | wedding_dress | choker | rose |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:--------------------|:-------------------|:--------|:-------|:---------|:-----------------|:-------------|:-------------|:-------|:---------|:----------|:--------|:-------------|:-------------------|:--------------|:--------------|:-------------------|:--------|:------|:----------|:------------------|:------|:-------------|:-----------|:-------------|:--------------|:----------------|:------------------------|:--------|:--------------|:--------------|:------------------|:-----------------|:--------------|:--------------|:--------|:-------------------|:--------|:------------|:----------|:----------|:-----------------------------|:----------------|:---------|:-------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 15 |  |  |  |  |  | X | | | X | | | X | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | | | | | X | | | | X | | | | X | | | | | | | X | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/anna_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T15:03:26+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T16:41:03+00:00 |
8a39f50394f20aba10ad59c0ac1fbae246176025 |
# Dataset of furen (Fire Emblem)
This is the dataset of furen (Fire Emblem), containing 466 images and their tags.
The core tags of this character are `green_hair, long_hair, green_eyes, hair_ornament, drill_hair, bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 466 | 497.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/furen_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 466 | 304.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/furen_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 977 | 605.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/furen_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 466 | 451.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/furen_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 977 | 832.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/furen_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/furen_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, closed_mouth, full_body, garreg_mach_monastery_uniform, long_sleeves, solo, black_footwear, simple_background, smile, white_background, knee_boots, pantyhose, black_dress |
| 1 | 11 |  |  |  |  |  | 1girl, closed_mouth, garreg_mach_monastery_uniform, smile, solo, long_sleeves, upper_body, simple_background, hairclip, white_background |
| 2 | 27 |  |  |  |  |  | 1girl, garreg_mach_monastery_uniform, solo, long_sleeves, open_mouth, upper_body, simple_background, white_background |
| 3 | 5 |  |  |  |  |  | 1girl, bell, cat_tail, dress, solo, alternate_costume, long_sleeves, tail_ornament, white_gloves, cat_ears, open_mouth, halloween_costume, holding, paw_gloves, paw_pose, paw_print, smile |
| 4 | 5 |  |  |  |  |  | 1boy, 1girl, blush, hetero, mosaic_censoring, solo_focus, looking_at_viewer, hairclip, open_mouth, pov, cum, handjob, licking_penis, tongue_out |
| 5 | 12 |  |  |  |  |  | 1girl, 1boy, hetero, open_mouth, vaginal, blush, penis, sex, breasts, solo_focus, cum_in_pussy, nipples, censored, spread_legs, completely_nude, sweat |
| 6 | 9 |  |  |  |  |  | 1girl, nipples, completely_nude, navel, solo, blush, pussy, looking_at_viewer, closed_mouth, small_breasts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | closed_mouth | full_body | garreg_mach_monastery_uniform | long_sleeves | solo | black_footwear | simple_background | smile | white_background | knee_boots | pantyhose | black_dress | upper_body | hairclip | open_mouth | bell | cat_tail | dress | alternate_costume | tail_ornament | white_gloves | cat_ears | halloween_costume | holding | paw_gloves | paw_pose | paw_print | 1boy | blush | hetero | mosaic_censoring | solo_focus | looking_at_viewer | pov | cum | handjob | licking_penis | tongue_out | vaginal | penis | sex | breasts | cum_in_pussy | nipples | censored | spread_legs | completely_nude | sweat | navel | pussy | small_breasts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:------------|:--------------------------------|:---------------|:-------|:-----------------|:--------------------|:--------|:-------------------|:-------------|:------------|:--------------|:-------------|:-----------|:-------------|:-------|:-----------|:--------|:--------------------|:----------------|:---------------|:-----------|:--------------------|:----------|:-------------|:-----------|:------------|:-------|:--------|:---------|:-------------------|:-------------|:--------------------|:------|:------|:----------|:----------------|:-------------|:----------|:--------|:------|:----------|:---------------|:----------|:-----------|:--------------|:------------------|:--------|:--------|:--------|:----------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | | X | X | X | | X | X | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 27 |  |  |  |  |  | X | | | X | X | X | | X | | X | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | | X | X | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 5 | 12 |  |  |  |  |  | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | | | |
| 6 | 9 |  |  |  |  |  | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | X | | | X | | X | X | X |
| CyberHarem/furen_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T15:03:28+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T16:31:57+00:00 |
d1dc27221cf2f1e8ac2360b46a5dd6482ca74b35 |
# Dataset of tiamo (Fire Emblem)
This is the dataset of tiamo (Fire Emblem), containing 449 images and their tags.
The core tags of this character are `long_hair, red_hair, red_eyes, breasts, hair_ornament, hair_between_eyes, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 449 | 532.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamo_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 449 | 324.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamo_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1004 | 622.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamo_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 449 | 482.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamo_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1004 | 835.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tiamo_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tiamo_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
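The IMG+TXT packages above pair each image with a same-named `.txt` tag file; a minimal sketch for walking an extracted `dataset-800` directory (the directory name and image extensions are assumptions, check your extracted files):
```python
# Minimal sketch: pair images with their .txt tag files in an extracted
# IMG+TXT package. Directory name and image extensions are assumptions.
from pathlib import Path

root = Path('dataset-800')
for img in sorted(root.glob('*')):
    if img.suffix.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
        continue
    txt = img.with_suffix('.txt')
    if txt.exists():
        print(img.name, '->', txt.read_text().strip())
```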
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, blush, looking_at_viewer, navel, nipples, smile, solo, large_breasts, completely_nude, female_pubic_hair, pussy |
| 1 | 6 |  |  |  |  |  | 1girl, blush, navel, nipples, solo, thighhighs, looking_at_viewer, pussy, elbow_gloves, medium_breasts, nude, small_breasts, smile |
| 2 | 11 |  |  |  |  |  | 1boy, 1girl, blush, completely_nude, hetero, nipples, sex, solo_focus, vaginal, mosaic_censoring, navel, open_mouth, penis, pussy, spread_legs, small_breasts, medium_breasts, sweat, missionary, on_back, pov, looking_at_viewer, pillow |
| 3 | 5 |  |  |  |  |  | 1boy, 1girl, anus, blush, completely_nude, hetero, looking_at_viewer, looking_back, mosaic_censoring, open_mouth, penis, pussy, solo_focus, vaginal, medium_breasts, nipples, sex_from_behind, ass_grab, girl_on_top, reverse_cowgirl_position, indoors, spread_legs, sweat, wing_hair_ornament |
| 4 | 10 |  |  |  |  |  | 1girl, garter_straps, gauntlets, solo, thighhighs, thigh_boots, spear, breastplate, looking_at_viewer, belt |
| 5 | 6 |  |  |  |  |  | 1girl, garter_straps, holding_weapon, solo, thigh_boots, thighhighs, breastplate, feathers, gloves, looking_at_viewer, red_dress, short_dress, smile, spear, wing_hair_ornament, gauntlets, shoulder_armor, zettai_ryouiki |
| 6 | 9 |  |  |  |  |  | 1girl, gauntlets, smile, solo, looking_at_viewer, polearm, breastplate, holding_weapon, simple_background, white_background |
| 7 | 5 |  |  |  |  |  | 1girl, full_body, holding_bow_(weapon), solo, wedding_dress, white_dress, high_heels, looking_at_viewer, bride, gloves, simple_background, smile, bare_shoulders, bridal_gauntlets, grey_background, holding_arrow, one_eye_closed, open_mouth, white_background |
| 8 | 9 |  |  |  |  |  | wedding_dress, 1girl, bare_shoulders, blush, looking_at_viewer, smile, solo, white_dress, bride, pearl_necklace |
| 9 | 9 |  |  |  |  |  | 1girl, solo, bare_shoulders, holding_weapon, looking_at_viewer, navel, red_bikini, fingerless_gloves, collarbone, fish, smile, bangs, bikini_skirt, cleavage, full_body, simple_background, spear, blush, sandals, small_breasts, toeless_footwear |
| 10 | 14 |  |  |  |  |  | 1girl, looking_at_viewer, solo, red_bikini, navel, smile, blush, sky, small_breasts, upper_body, collarbone, bare_shoulders, cloud, day, bangs, open_mouth, outdoors, wing_hair_ornament |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | looking_at_viewer | navel | nipples | smile | solo | large_breasts | completely_nude | female_pubic_hair | pussy | thighhighs | elbow_gloves | medium_breasts | nude | small_breasts | 1boy | hetero | sex | solo_focus | vaginal | mosaic_censoring | open_mouth | penis | spread_legs | sweat | missionary | on_back | pov | pillow | anus | looking_back | sex_from_behind | ass_grab | girl_on_top | reverse_cowgirl_position | indoors | wing_hair_ornament | garter_straps | gauntlets | thigh_boots | spear | breastplate | belt | holding_weapon | feathers | gloves | red_dress | short_dress | shoulder_armor | zettai_ryouiki | polearm | simple_background | white_background | full_body | holding_bow_(weapon) | wedding_dress | white_dress | high_heels | bride | bare_shoulders | bridal_gauntlets | grey_background | holding_arrow | one_eye_closed | pearl_necklace | red_bikini | fingerless_gloves | collarbone | fish | bangs | bikini_skirt | cleavage | sandals | toeless_footwear | sky | upper_body | cloud | day | outdoors |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:--------------------|:--------|:----------|:--------|:-------|:----------------|:------------------|:--------------------|:--------|:-------------|:---------------|:-----------------|:-------|:----------------|:-------|:---------|:------|:-------------|:----------|:-------------------|:-------------|:--------|:--------------|:--------|:-------------|:----------|:------|:---------|:-------|:---------------|:------------------|:-----------|:--------------|:---------------------------|:----------|:---------------------|:----------------|:------------|:--------------|:--------|:--------------|:-------|:-----------------|:-----------|:---------|:------------|:--------------|:-----------------|:-----------------|:----------|:--------------------|:-------------------|:------------|:-----------------------|:----------------|:--------------|:-------------|:--------|:-----------------|:-------------------|:------------------|:----------------|:-----------------|:-----------------|:-------------|:--------------------|:-------------|:-------|:--------|:---------------|:-----------|:----------|:-------------------|:------|:-------------|:--------|:------|:-----------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | X | X | X | X | | | | X | | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | | X | | | | X | | X | | | X | | | X | X | | X | X | X | X | X | X | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | | X | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | X | | | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | X | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | X | | | X | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 8 | 9 |  |  |  |  |  | X | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | X | X | | | | | X | | | | | | | | | | | | | | |
| 9 | 9 |  |  |  |  |  | X | X | X | X | | X | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | X | | X | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | | | | | |
| 10 | 14 |  |  |  |  |  | X | X | X | X | | X | X | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | X | | X | | | | | X | X | X | X | X |
| CyberHarem/tiamo_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T15:03:39+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T16:43:50+00:00 |
a01aca7c9c451376845f00f5bec267cd504c493f | Yevhenii1234/test | [
"license:apache-2.0",
"region:us"
] | 2024-01-17T15:09:40+00:00 | {"license": "apache-2.0"} | 2024-01-17T15:09:40+00:00 |
|
9c87532c584b68abb25361927af33cf16dbb010a |
# LVIS
### Dataset Summary
This dataset is an implementation of the LVIS dataset as a Hugging Face dataset. Please visit the original website for more information.
- https://www.lvisdataset.org/
### Loading
This code returns train, validation and test generators.
```python
from datasets import load_dataset
dataset = load_dataset("winvoker/lvis")
```
`objects` is a dictionary that contains annotation information such as the bounding box and class.
```
DatasetDict({
train: Dataset({
features: ['id', 'image', 'height', 'width', 'objects'],
num_rows: 100170
})
validation: Dataset({
features: ['id', 'image', 'height', 'width', 'objects'],
num_rows: 4809
})
test: Dataset({
features: ['id', 'image', 'height', 'width', 'objects'],
num_rows: 19822
})
})
```
### Access Generators
```python
train = dataset["train"]
validation = dataset["validation"]
test = dataset["test"]
```
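As a minimal sketch (not part of the original card), you can iterate over a split and read the per-object annotations, assuming the fields match the example row shown below:
```python
# Minimal sketch: print the first few rows and their object annotations.
# Assumption: field layout matches the example row below.
for example in train.select(range(3)):
    objects = example['objects']
    print(example['image'], example['height'], example['width'])
    for bbox, cls in zip(objects['bboxes'], objects['classes']):
        print('  class', cls, 'bbox', bbox)
```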
An example row is as follows.
```json
{ 'id': 0,
'image': '000000437561.jpg',
'height': 480,
'width': 640,
'objects': {
    'bboxes': [[392, 271, 14, 3]],
'classes': [117],
'segmentation': [[376, 272, 375, 270, 372, 269, 371, 269, 373, 269, 373]]
}
}
``` | anvilarth/lvis | [
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-01-17T15:10:32+00:00 | {"language": ["en"], "license": "apache-2.0"} | 2024-01-17T15:20:57+00:00 |
1212b5af147046faca7556244cc593b2bf7d4217 | matekadlicsko/hungarian-news-translations | [
"task_categories:translation",
"size_categories:10K<n<100K",
"language:hu",
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-01-17T15:11:16+00:00 | {"language": ["hu", "en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["translation"], "dataset_info": {"features": [{"name": "en", "dtype": "string"}, {"name": "hu", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6957683.788345884, "num_examples": 14840}, {"name": "test", "num_bytes": 2319384.2116541164, "num_examples": 4947}], "download_size": 6204740, "dataset_size": 9277068.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-15T16:02:30+00:00 |
|
b20883557e0f1e05be6480dfa1e7be3bcbdb18db |
# Dataset Card for Evaluation run of KnutJaegersberg/Deita-Qwen-1_8B
Dataset automatically created during the evaluation run of model [KnutJaegersberg/Deita-Qwen-1_8B](https://huggingface.co/KnutJaegersberg/Deita-Qwen-1_8B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Deita-Qwen-1_8B",
"harness_winogrande_5",
split="train")
```
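For quick inspection, the loaded split can be converted to a pandas DataFrame with the standard `datasets` API:
```python
# Minimal sketch: inspect the per-example details as a DataFrame.
df = data.to_pandas()
print(df.columns)
print(df.head())
```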
## Latest results
These are the [latest results from run 2024-01-17T15:12:59.171599](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Deita-Qwen-1_8B/blob/main/results_2024-01-17T15-12-59.171599.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4513695685616451,
"acc_stderr": 0.03473174413713779,
"acc_norm": 0.4572369573723266,
"acc_norm_stderr": 0.035511285827617124,
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299979,
"mc2": 0.4002214148044727,
"mc2_stderr": 0.014908452990717655
},
"harness|arc:challenge|25": {
"acc": 0.32081911262798635,
"acc_stderr": 0.013640943091946522,
"acc_norm": 0.3651877133105802,
"acc_norm_stderr": 0.014070265519268802
},
"harness|hellaswag|10": {
"acc": 0.4574785899223262,
"acc_stderr": 0.004971704917267752,
"acc_norm": 0.6062537343158734,
"acc_norm_stderr": 0.004875812021461993
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309173,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309173
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4830188679245283,
"acc_stderr": 0.030755120364119898,
"acc_norm": 0.4830188679245283,
"acc_norm_stderr": 0.030755120364119898
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.375,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4393063583815029,
"acc_stderr": 0.03784271932887467,
"acc_norm": 0.4393063583815029,
"acc_norm_stderr": 0.03784271932887467
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383889,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383889
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.023919984164047732,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.023919984164047732
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5225806451612903,
"acc_stderr": 0.02841498501970786,
"acc_norm": 0.5225806451612903,
"acc_norm_stderr": 0.02841498501970786
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.038956580652718446,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.038956580652718446
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5303030303030303,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.5303030303030303,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6010362694300518,
"acc_stderr": 0.03533999094065696,
"acc_norm": 0.6010362694300518,
"acc_norm_stderr": 0.03533999094065696
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.43333333333333335,
"acc_stderr": 0.02512465352588513,
"acc_norm": 0.43333333333333335,
"acc_norm_stderr": 0.02512465352588513
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.02659393910184405,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.02659393910184405
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4495798319327731,
"acc_stderr": 0.03231293497137707,
"acc_norm": 0.4495798319327731,
"acc_norm_stderr": 0.03231293497137707
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.0386155754625517,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.0386155754625517
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5614678899082569,
"acc_stderr": 0.021274713073954572,
"acc_norm": 0.5614678899082569,
"acc_norm_stderr": 0.021274713073954572
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044811,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044811
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03507793834791324,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03507793834791324
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6033755274261603,
"acc_stderr": 0.03184399873811224,
"acc_norm": 0.6033755274261603,
"acc_norm_stderr": 0.03184399873811224
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4977578475336323,
"acc_stderr": 0.033557465352232634,
"acc_norm": 0.4977578475336323,
"acc_norm_stderr": 0.033557465352232634
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5114503816793893,
"acc_stderr": 0.043841400240780176,
"acc_norm": 0.5114503816793893,
"acc_norm_stderr": 0.043841400240780176
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6033057851239669,
"acc_stderr": 0.044658697805310094,
"acc_norm": 0.6033057851239669,
"acc_norm_stderr": 0.044658697805310094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.44171779141104295,
"acc_stderr": 0.03901591825836184,
"acc_norm": 0.44171779141104295,
"acc_norm_stderr": 0.03901591825836184
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.046561471100123514,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.046561471100123514
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6709401709401709,
"acc_stderr": 0.03078232157768817,
"acc_norm": 0.6709401709401709,
"acc_norm_stderr": 0.03078232157768817
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5504469987228607,
"acc_stderr": 0.017788725283507337,
"acc_norm": 0.5504469987228607,
"acc_norm_stderr": 0.017788725283507337
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.48265895953757226,
"acc_stderr": 0.026902900458666647,
"acc_norm": 0.48265895953757226,
"acc_norm_stderr": 0.026902900458666647
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.01444415780826144,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.01444415780826144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.02840830202033269,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.02840830202033269
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5048231511254019,
"acc_stderr": 0.028396770444111298,
"acc_norm": 0.5048231511254019,
"acc_norm_stderr": 0.028396770444111298
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.45987654320987653,
"acc_stderr": 0.02773102275353928,
"acc_norm": 0.45987654320987653,
"acc_norm_stderr": 0.02773102275353928
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.37943262411347517,
"acc_stderr": 0.028947338851614105,
"acc_norm": 0.37943262411347517,
"acc_norm_stderr": 0.028947338851614105
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3324641460234681,
"acc_stderr": 0.012032022332260507,
"acc_norm": 0.3324641460234681,
"acc_norm_stderr": 0.012032022332260507
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4007352941176471,
"acc_stderr": 0.029768263528933102,
"acc_norm": 0.4007352941176471,
"acc_norm_stderr": 0.029768263528933102
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.019835176484375376,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.019835176484375376
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5387755102040817,
"acc_stderr": 0.031912820526692774,
"acc_norm": 0.5387755102040817,
"acc_norm_stderr": 0.031912820526692774
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.582089552238806,
"acc_stderr": 0.034875586404620636,
"acc_norm": 0.582089552238806,
"acc_norm_stderr": 0.034875586404620636
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079023,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079023
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5497076023391813,
"acc_stderr": 0.038158273659132366,
"acc_norm": 0.5497076023391813,
"acc_norm_stderr": 0.038158273659132366
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299979,
"mc2": 0.4002214148044727,
"mc2_stderr": 0.014908452990717655
},
"harness|winogrande|5": {
"acc": 0.5935280189423836,
"acc_stderr": 0.013804448697753376
},
"harness|gsm8k|5": {
"acc": 0.1561789234268385,
"acc_stderr": 0.00999950936975745
}
}
```
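To inspect the per-sample predictions behind the aggregates above, each task's details can be loaded with the `datasets` library. A minimal sketch, mirroring the loading example in this repository's metadata; the Winogrande config is used purely as an illustration:

```python
from datasets import load_dataset

# Each task has its own config; the "latest" split always points at the
# most recent evaluation run (timestamped splits hold earlier runs).
data = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__Deita-Qwen-1_8B",
    "harness_winogrande_5",
    split="latest",
)
print(data[0])
```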
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_KnutJaegersberg__Deita-Qwen-1_8B | [
"region:us"
] | 2024-01-17T15:15:09+00:00 | {"pretty_name": "Evaluation run of KnutJaegersberg/Deita-Qwen-1_8B", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/Deita-Qwen-1_8B](https://huggingface.co/KnutJaegersberg/Deita-Qwen-1_8B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__Deita-Qwen-1_8B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-17T15:12:59.171599](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Deita-Qwen-1_8B/blob/main/results_2024-01-17T15-12-59.171599.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4513695685616451,\n \"acc_stderr\": 0.03473174413713779,\n \"acc_norm\": 0.4572369573723266,\n \"acc_norm_stderr\": 0.035511285827617124,\n \"mc1\": 0.2521419828641371,\n \"mc1_stderr\": 0.015201522246299979,\n \"mc2\": 0.4002214148044727,\n \"mc2_stderr\": 0.014908452990717655\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.32081911262798635,\n \"acc_stderr\": 0.013640943091946522,\n \"acc_norm\": 0.3651877133105802,\n \"acc_norm_stderr\": 0.014070265519268802\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4574785899223262,\n \"acc_stderr\": 0.004971704917267752,\n \"acc_norm\": 0.6062537343158734,\n \"acc_norm_stderr\": 0.004875812021461993\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309173,\n \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309173\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4830188679245283,\n \"acc_stderr\": 0.030755120364119898,\n \"acc_norm\": 0.4830188679245283,\n \"acc_norm_stderr\": 0.030755120364119898\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 
0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4393063583815029,\n \"acc_stderr\": 0.03784271932887467,\n \"acc_norm\": 0.4393063583815029,\n \"acc_norm_stderr\": 0.03784271932887467\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383889,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383889\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.023919984164047732,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.023919984164047732\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5225806451612903,\n \"acc_stderr\": 0.02841498501970786,\n \"acc_norm\": 0.5225806451612903,\n \"acc_norm_stderr\": 0.02841498501970786\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.038956580652718446,\n \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.038956580652718446\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5303030303030303,\n \"acc_stderr\": 0.03555804051763929,\n \"acc_norm\": 0.5303030303030303,\n \"acc_norm_stderr\": 0.03555804051763929\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6010362694300518,\n \"acc_stderr\": 0.03533999094065696,\n \"acc_norm\": 0.6010362694300518,\n \"acc_norm_stderr\": 0.03533999094065696\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.43333333333333335,\n \"acc_stderr\": 0.02512465352588513,\n \"acc_norm\": 0.43333333333333335,\n \"acc_norm_stderr\": 0.02512465352588513\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184405,\n \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184405\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4495798319327731,\n \"acc_stderr\": 0.03231293497137707,\n \"acc_norm\": 0.4495798319327731,\n \"acc_norm_stderr\": 0.03231293497137707\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5614678899082569,\n \"acc_stderr\": 0.021274713073954572,\n \"acc_norm\": 0.5614678899082569,\n \"acc_norm_stderr\": 0.021274713073954572\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044811,\n \"acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044811\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03507793834791324,\n \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03507793834791324\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6033755274261603,\n \"acc_stderr\": 0.03184399873811224,\n \"acc_norm\": 0.6033755274261603,\n \"acc_norm_stderr\": 0.03184399873811224\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4977578475336323,\n \"acc_stderr\": 0.033557465352232634,\n \"acc_norm\": 0.4977578475336323,\n \"acc_norm_stderr\": 0.033557465352232634\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5114503816793893,\n \"acc_stderr\": 0.043841400240780176,\n \"acc_norm\": 0.5114503816793893,\n \"acc_norm_stderr\": 0.043841400240780176\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6033057851239669,\n \"acc_stderr\": 0.044658697805310094,\n \"acc_norm\": 0.6033057851239669,\n \"acc_norm_stderr\": 0.044658697805310094\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.44171779141104295,\n \"acc_stderr\": 0.03901591825836184,\n \"acc_norm\": 0.44171779141104295,\n \"acc_norm_stderr\": 0.03901591825836184\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.046561471100123514,\n \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.046561471100123514\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6709401709401709,\n \"acc_stderr\": 0.03078232157768817,\n \"acc_norm\": 0.6709401709401709,\n \"acc_norm_stderr\": 0.03078232157768817\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5504469987228607,\n \"acc_stderr\": 0.017788725283507337,\n 
\"acc_norm\": 0.5504469987228607,\n \"acc_norm_stderr\": 0.017788725283507337\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.48265895953757226,\n \"acc_stderr\": 0.026902900458666647,\n \"acc_norm\": 0.48265895953757226,\n \"acc_norm_stderr\": 0.026902900458666647\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n \"acc_stderr\": 0.01444415780826144,\n \"acc_norm\": 0.24804469273743016,\n \"acc_norm_stderr\": 0.01444415780826144\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5620915032679739,\n \"acc_stderr\": 0.02840830202033269,\n \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.02840830202033269\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5048231511254019,\n \"acc_stderr\": 0.028396770444111298,\n \"acc_norm\": 0.5048231511254019,\n \"acc_norm_stderr\": 0.028396770444111298\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.45987654320987653,\n \"acc_stderr\": 0.02773102275353928,\n \"acc_norm\": 0.45987654320987653,\n \"acc_norm_stderr\": 0.02773102275353928\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614105,\n \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614105\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3324641460234681,\n \"acc_stderr\": 0.012032022332260507,\n \"acc_norm\": 0.3324641460234681,\n \"acc_norm_stderr\": 0.012032022332260507\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4007352941176471,\n \"acc_stderr\": 0.029768263528933102,\n \"acc_norm\": 0.4007352941176471,\n \"acc_norm_stderr\": 0.029768263528933102\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.019835176484375376,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.019835176484375376\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5387755102040817,\n \"acc_stderr\": 0.031912820526692774,\n \"acc_norm\": 0.5387755102040817,\n \"acc_norm_stderr\": 0.031912820526692774\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.582089552238806,\n \"acc_stderr\": 0.034875586404620636,\n \"acc_norm\": 0.582089552238806,\n \"acc_norm_stderr\": 0.034875586404620636\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n \"acc_stderr\": 0.03828401115079023,\n \"acc_norm\": 0.40963855421686746,\n \"acc_norm_stderr\": 0.03828401115079023\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5497076023391813,\n \"acc_stderr\": 0.038158273659132366,\n \"acc_norm\": 0.5497076023391813,\n \"acc_norm_stderr\": 0.038158273659132366\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2521419828641371,\n \"mc1_stderr\": 0.015201522246299979,\n \"mc2\": 0.4002214148044727,\n \"mc2_stderr\": 0.014908452990717655\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5935280189423836,\n \"acc_stderr\": 0.013804448697753376\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1561789234268385,\n \"acc_stderr\": 0.00999950936975745\n }\n}\n```", "repo_url": 
"https://huggingface.co/KnutJaegersberg/Deita-Qwen-1_8B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|arc:challenge|25_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|gsm8k|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hellaswag|10_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T15-12-59.171599.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T15-12-59.171599.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-17T15-12-59.171599.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T15-12-59.171599.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T15-12-59.171599.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T15-12-59.171599.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["**/details_harness|winogrande|5_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-17T15-12-59.171599.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_17T15_12_59.171599", "path": ["results_2024-01-17T15-12-59.171599.parquet"]}, {"split": "latest", "path": 
["results_2024-01-17T15-12-59.171599.parquet"]}]}]} | 2024-01-17T15:15:37+00:00 |
329cc063c1518678f55887a723a527ef1a35e8cc |
```
CCI-Data
SkyPile-150B
TeleChat-PTD
WebText-cn
WuDaoCorpus2.0
wangan
yayi2_pretrain_data
```
Merged the corpora listed above and applied a round of MinHash deduplication, yielding a final Chinese pretraining corpus of roughly 550B.
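For context, here is a minimal sketch of what such a MinHash near-duplicate filter can look like, assuming the third-party `datasketch` library; the permutation count, similarity threshold, and n-gram size are illustrative assumptions, not the settings used to build this corpus:
```python
from datasketch import MinHash, MinHashLSH

NUM_PERM = 128    # hash permutations per signature (assumed, not the corpus setting)
THRESHOLD = 0.8   # Jaccard similarity treated as "near-duplicate" (assumed)

def minhash_of(text: str, ngram: int = 5) -> MinHash:
    # character n-grams work well for Chinese text, which has no word spaces
    m = MinHash(num_perm=NUM_PERM)
    for i in range(max(len(text) - ngram + 1, 1)):
        m.update(text[i:i + ngram].encode("utf-8"))
    return m

def dedup(docs):
    # keep a document only if the LSH index holds nothing similar yet
    lsh = MinHashLSH(threshold=THRESHOLD, num_perm=NUM_PERM)
    for idx, doc in enumerate(docs):
        sig = minhash_of(doc)
        if not lsh.query(sig):
            lsh.insert(f"doc-{idx}", sig)
            yield doc

print(list(dedup(["今天天气很好。", "今天天气很好!", "完全不同的一篇文档。"])))
```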
| genggui001/gg_zh_v1_550B | [
"task_categories:text-generation",
"size_categories:100B<n<1T",
"language:zh",
"region:us"
] | 2024-01-17T15:17:57+00:00 | {"language": ["zh"], "size_categories": ["100B<n<1T"], "task_categories": ["text-generation"]} | 2024-01-20T10:47:46+00:00 |
2797e1b70ec78d38f85cf0dff225a87c7d395937 | bertbsb/Herbertbetto | [
"license:openrail",
"region:us"
] | 2024-01-17T15:26:37+00:00 | {"license": "openrail"} | 2024-01-17T15:31:58+00:00 |
|
9305db165c1536cae36a0ee7115de7e2cd7fb671 |
# Dataset of micaiah (Fire Emblem)
This is the dataset of micaiah (Fire Emblem), containing 500 images and their tags.
The core tags of this character are `long_hair, yellow_eyes, bangs, grey_hair, ribbon, hair_ribbon, half_updo, breasts, white_hair, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 769.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/micaiah_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 397.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/micaiah_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1237 | 854.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/micaiah_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 661.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/micaiah_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1237 | 1.22 GiB | [Download](https://huggingface.co/datasets/CyberHarem/micaiah_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/micaiah_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
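If you prefer one of the preprocessed IMG+TXT packages over the raw archive, the same download pattern applies; here is a minimal sketch (the `dataset-800.zip` filename comes from the package table above, while the image extensions globbed below are an assumption about the archive layout):
```python
import os
import zipfile
from glob import glob
from huggingface_hub import hf_hub_download

# download a preprocessed package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/micaiah_fireemblem',
    repo_type='dataset',
    filename='dataset-800.zip',  # shorter side not exceeding 800 pixels
)
dataset_dir = 'dataset_800_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# pair each image with its same-named .txt tag file (extensions assumed)
images = []
for ext in ('*.png', '*.jpg', '*.webp'):
    images += glob(os.path.join(dataset_dir, '**', ext), recursive=True)
for img_path in images:
    txt_path = os.path.splitext(img_path)[0] + '.txt'
    if os.path.exists(txt_path):
        with open(txt_path, encoding='utf-8') as f:
            print(img_path, f.read().strip())
```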
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 30 |  |  |  |  |  | official_alternate_costume, red_bikini, 1girl, solo, bare_shoulders, cleavage, hair_flower, navel, bikini_skirt, blue_scarf, looking_at_viewer, red_gloves, collarbone, open_mouth, bird, blush, groin, :d, front-tie_bikini_top, towel, cowboy_shot, simple_background, miniskirt, outdoors, sky, fingerless_gloves, water |
| 1 | 6 |  |  |  |  |  | 1girl, blush, nipples, solo, collarbone, groin, looking_at_viewer, navel, pussy, smile, simple_background, ass_visible_through_thighs, completely_nude, white_background |
| 2 | 7 |  |  |  |  |  | 1girl, bangle, bare_shoulders, belt, black_gloves, black_pantyhose, blue_scarf, elbow_gloves, fingerless_gloves, side_slit, simple_background, sleeveless_dress, solo, bird, boots, smile, white_background |
| 3 | 5 |  |  |  |  |  | 1girl, bangle, bare_shoulders, black_gloves, black_pantyhose, blue_scarf, cowboy_shot, elbow_gloves, fingerless_gloves, side_slit, simple_background, sleeveless_dress, solo, white_background, belt, looking_at_viewer, smile, blush, hand_on_own_chest |
| 4 | 5 |  |  |  |  |  | 1girl, bare_shoulders, black_gloves, blue_scarf, elbow_gloves, fingerless_gloves, simple_background, sleeveless_dress, solo, upper_body, bangle, smile, white_background, bird_on_hand |
| 5 | 6 |  |  |  |  |  | 1girl, bare_shoulders, blue_cape, simple_background, sleeveless_dress, solo, turtleneck_dress, bangle, looking_at_viewer, smile, elbow_gloves, fingerless_gloves, black_pantyhose |
| 6 | 21 |  |  |  |  |  | 1girl, bare_shoulders, solo, jewelry, looking_at_viewer, sleeveless_dress, smile, official_alternate_costume, simple_background, turtleneck_dress, upper_body, white_background, white_dress, flower, wedding_dress, blush, bouquet, holding, open_mouth, white_gloves |
| 7 | 6 |  |  |  |  |  | 1girl, bangle, bare_shoulders, black_dress, black_gloves, bridal_gauntlets, circlet, official_alternate_costume, side_slit, sleeveless_dress, solo, turtleneck_dress, smile, earrings, elbow_gloves, fur-trimmed_coat, looking_at_viewer, red_cape, cowboy_shot, full_body, red_coat, simple_background |
| 8 | 9 |  |  |  |  |  | 1girl, bare_shoulders, elbow_gloves, gradient_clothes, official_alternate_costume, shiny_clothes, solo, black_gloves, short_dress, looking_at_viewer, simple_background, sleeveless_dress, torn_cape, bird, hair_bow, bangle, black_dress, grey_background, pantyhose, shiny_hair, skirt, black_ribbon, smile, thigh_boots, turtleneck |
| 9 | 5 |  |  |  |  |  | 1girl, bare_shoulders, circlet, long_sleeves, red_cape, solo, official_alternate_costume, simple_background, white_background, bangle, bridal_gauntlets, detached_sleeves, full_body, open_mouth, sandals, smile, turtleneck_dress, white_dress, magic, sleeveless_dress |
| 10 | 5 |  |  |  |  |  | 1boy, 1girl, blue_scarf, blush, hetero, mosaic_censoring, penis, solo_focus, bare_shoulders, cum_in_mouth, fellatio, from_side, sleeveless_dress, upper_body, brick_wall, gloves, heart, nipples, nude, pink_background, profile, simple_background, smile, tears |
| 11 | 30 |  |  |  |  |  | 1girl, blush, nipples, 1boy, hetero, sex, solo_focus, open_mouth, vaginal, navel, penis, sweat, spread_legs, collarbone, pov, smile, large_breasts, looking_at_viewer, completely_nude, mosaic_censoring, cum_in_pussy, cowgirl_position, bed_sheet, birthmark, on_back |
| 12 | 16 |  |  |  |  |  | 1girl, yukata, butterfly_print, official_alternate_costume, solo, blush, obi, looking_at_viewer, smile, wide_sleeves, simple_background, upper_body, holding, open_mouth, twitter_username, white_background |
| 13 | 7 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, solo, cosplay, alternate_costume, blue_cape, bodystocking, covered_navel, simple_background, skin_tight, smile, bracelet, white_background, bridal_gauntlets, full_body, open_book, open_mouth |
| 14 | 9 |  |  |  |  |  | 1girl, cleavage, crop_top, looking_at_viewer, midriff, navel, short_shorts, smile, tied_shirt, alternate_costume, blush, checkered_shirt, collarbone, denim_shorts, short_sleeves, solo, beer_mug, front-tie_top, holding_cup, large_breasts, blue_shorts, no_gloves, plaid, twitter_username, cowboy_shot |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | official_alternate_costume | red_bikini | 1girl | solo | bare_shoulders | cleavage | hair_flower | navel | bikini_skirt | blue_scarf | looking_at_viewer | red_gloves | collarbone | open_mouth | bird | blush | groin | :d | front-tie_bikini_top | towel | cowboy_shot | simple_background | miniskirt | outdoors | sky | fingerless_gloves | water | nipples | pussy | smile | ass_visible_through_thighs | completely_nude | white_background | bangle | belt | black_gloves | black_pantyhose | elbow_gloves | side_slit | sleeveless_dress | boots | hand_on_own_chest | upper_body | bird_on_hand | blue_cape | turtleneck_dress | jewelry | white_dress | flower | wedding_dress | bouquet | holding | white_gloves | black_dress | bridal_gauntlets | circlet | earrings | fur-trimmed_coat | red_cape | full_body | red_coat | gradient_clothes | shiny_clothes | short_dress | torn_cape | hair_bow | grey_background | pantyhose | shiny_hair | skirt | black_ribbon | thigh_boots | turtleneck | long_sleeves | detached_sleeves | sandals | magic | 1boy | hetero | mosaic_censoring | penis | solo_focus | cum_in_mouth | fellatio | from_side | brick_wall | gloves | heart | nude | pink_background | profile | tears | sex | vaginal | sweat | spread_legs | pov | large_breasts | cum_in_pussy | cowgirl_position | bed_sheet | birthmark | on_back | yukata | butterfly_print | obi | wide_sleeves | twitter_username | cosplay | alternate_costume | bodystocking | covered_navel | skin_tight | bracelet | open_book | crop_top | midriff | short_shorts | tied_shirt | checkered_shirt | denim_shorts | short_sleeves | beer_mug | front-tie_top | holding_cup | blue_shorts | no_gloves | plaid |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-----------------------------|:-------------|:--------|:-------|:-----------------|:-----------|:--------------|:--------|:---------------|:-------------|:--------------------|:-------------|:-------------|:-------------|:-------|:--------|:--------|:-----|:-----------------------|:--------|:--------------|:--------------------|:------------|:-----------|:------|:--------------------|:--------|:----------|:--------|:--------|:-----------------------------|:------------------|:-------------------|:---------|:-------|:---------------|:------------------|:---------------|:------------|:-------------------|:--------|:--------------------|:-------------|:---------------|:------------|:-------------------|:----------|:--------------|:---------|:----------------|:----------|:----------|:---------------|:--------------|:-------------------|:----------|:-----------|:-------------------|:-----------|:------------|:-----------|:-------------------|:----------------|:--------------|:------------|:-----------|:------------------|:------------|:-------------|:--------|:---------------|:--------------|:-------------|:---------------|:-------------------|:----------|:--------|:-------|:---------|:-------------------|:--------|:-------------|:---------------|:-----------|:------------|:-------------|:---------|:--------|:-------|:------------------|:----------|:--------|:------|:----------|:--------|:--------------|:------|:----------------|:---------------|:-------------------|:------------|:------------|:----------|:---------|:------------------|:------|:---------------|:-------------------|:----------|:--------------------|:---------------|:----------------|:-------------|:-----------|:------------|:-----------|:----------|:---------------|:-------------|:------------------|:---------------|:----------------|:-----------|:----------------|:--------------|:--------------|:------------|:--------|
| 0 | 30 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | | | X | X | | | | X | | | X | | X | | | X | X | | | | | X | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | | | X | X | X | | | | | X | | | | | X | | | | | | | X | | | | X | | | | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | | | X | X | X | | | | | X | X | | | | | X | | | | | X | X | | | | X | | | | X | | | X | X | X | X | X | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | | | X | X | X | | | | | X | | | | | | | | | | | | X | | | | X | | | | X | | | X | X | | X | | X | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | | | X | X | X | | | | | | X | | | | | | | | | | | X | | | | X | | | | X | | | | X | | | X | X | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 21 |  |  |  |  |  | X | | X | X | X | | | | | | X | | | X | | X | | | | | | X | | | | | | | | X | | | X | | | | | | | X | | | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | | X | X | X | | | | | | X | | | | | | | | | | X | X | | | | | | | | X | | | | X | | X | | X | X | X | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 9 |  |  |  |  |  | X | | X | X | X | | | | | | X | | | | X | | | | | | | X | | | | | | | | X | | | | X | | X | | X | | X | | | | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | | X | X | X | | | | | | | | | X | | | | | | | | X | | | | | | | | X | | | X | X | | | | | | X | | | | | | X | | X | | | | | | | X | X | | | X | X | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 5 |  |  |  |  |  | | | X | | X | | | | | X | | | | | | X | | | | | | X | | | | | | X | | X | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 30 |  |  |  |  |  | | | X | | | | | X | | | X | | X | X | | X | | | | | | | | | | | | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 12 | 16 |  |  |  |  |  | X | | X | X | | | | | | | X | | | X | | X | | | | | | X | | | | | | | | X | | | X | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 13 | 7 |  |  |  |  |  | | | X | X | | X | | | | | X | | | X | | | | | | | | X | | | | | | | | X | | | X | | | | | | | | | | | | X | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 14 | 9 |  |  |  |  |  | | | X | X | | X | | X | | | X | | X | | | X | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/micaiah_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T15:27:38+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T19:29:23+00:00 |
16ccf52e3e19a35948a5b62d846edef44dc57f2f |
# Dataset of ninian (Fire Emblem)
This is the dataset of ninian (Fire Emblem), containing 388 images and their tags.
The core tags of this character are `long_hair, blue_hair, red_eyes, hair_ornament, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 388 | 458.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ninian_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 388 | 280.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ninian_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 811 | 533.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ninian_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 388 | 414.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ninian_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 811 | 717.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ninian_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ninian_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, bare_shoulders, dress, looking_at_viewer, simple_background, smile, solo, cape, full_body, white_background |
| 1 | 11 |  |  |  |  |  | 1girl, bare_shoulders, solo, looking_at_viewer, smile, cape, blue_dress, blush |
| 2 | 5 |  |  |  |  |  | 1girl, bare_shoulders, dress, looking_at_viewer, solo, white_background, simple_background, smile |
| 3 | 43 |  |  |  |  |  | 1girl, bare_shoulders, bride, wedding_dress, smile, solo, white_dress, bridal_veil, flower, looking_at_viewer, bouquet, gloves, strapless_dress |
| 4 | 8 |  |  |  |  |  | 1girl, bangs, bare_shoulders, full_body, long_dress, solo, floating_object, medium_breasts, open_mouth, shiny_hair, gradient_clothes, stone, turtleneck, white_background, aqua_hair, blue_dress, looking_at_viewer, snowflakes, cape, dark_aura, glowing_eyes, simple_background, transparent_background |
| 5 | 8 |  |  |  |  |  | 1boy, 1girl, hetero, blush, penis, sex, solo_focus, nipples, open_mouth, vaginal, completely_nude, spread_legs, cum_in_pussy, large_breasts, lying, missionary, mosaic_censoring |
| 6 | 15 |  |  |  |  |  | hetero, multiple_boys, multiple_penises, 1girl, nipples, solo_focus, vaginal, large_breasts, gangbang, cum_in_pussy, mosaic_censoring, double_handjob, torn_clothes, blush, bukkake, dress, facial, fellatio, nude, rape, straddling |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | dress | looking_at_viewer | simple_background | smile | solo | cape | full_body | white_background | blue_dress | blush | bride | wedding_dress | white_dress | bridal_veil | flower | bouquet | gloves | strapless_dress | bangs | long_dress | floating_object | medium_breasts | open_mouth | shiny_hair | gradient_clothes | stone | turtleneck | aqua_hair | snowflakes | dark_aura | glowing_eyes | transparent_background | 1boy | hetero | penis | sex | solo_focus | nipples | vaginal | completely_nude | spread_legs | cum_in_pussy | large_breasts | lying | missionary | mosaic_censoring | multiple_boys | multiple_penises | gangbang | double_handjob | torn_clothes | bukkake | facial | fellatio | nude | rape | straddling |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------|:--------------------|:--------------------|:--------|:-------|:-------|:------------|:-------------------|:-------------|:--------|:--------|:----------------|:--------------|:--------------|:---------|:----------|:---------|:------------------|:--------|:-------------|:------------------|:-----------------|:-------------|:-------------|:-------------------|:--------|:-------------|:------------|:-------------|:------------|:---------------|:-------------------------|:-------|:---------|:--------|:------|:-------------|:----------|:----------|:------------------|:--------------|:---------------|:----------------|:--------|:-------------|:-------------------|:----------------|:-------------------|:-----------|:-----------------|:---------------|:----------|:---------|:-----------|:-------|:-------|:-------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | | X | | X | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 43 |  |  |  |  |  | X | X | | X | | X | X | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | X | | X | X | | X | X | X | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | | | | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 6 | 15 |  |  |  |  |  | X | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | X | X | | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/ninian_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T15:28:22+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T16:46:48+00:00 |
0681f61cf05e9cc28186e92f046ac7b9caa9f6b5 | Tails665/milini | [
"license:openrail",
"region:us"
] | 2024-01-17T15:29:32+00:00 | {"license": "openrail"} | 2024-01-17T15:30:43+00:00 |
|
b1c847c3dac0f213883b8c6717ac80827de7adda | SamagraDataGov/mistral_test_sample | [
"license:mit",
"region:us"
] | 2024-01-17T15:30:14+00:00 | {"license": "mit"} | 2024-01-17T15:38:42+00:00 |
|
558d40ef2c32493de9068d5e2b6de7b99c892df4 | alexpanick/friends | [
"license:openrail",
"region:us"
] | 2024-01-17T15:31:59+00:00 | {"license": "openrail"} | 2024-01-18T13:44:27+00:00 |
|
709a78ddb52ec490593d2c0cd5baf0021526d1b6 | MatsuoDochiai/Lanjax | [
"license:openrail",
"region:us"
] | 2024-01-17T15:34:52+00:00 | {"license": "openrail"} | 2024-01-17T15:35:43+00:00 |
|
6c4e43f0aca28eb96f6ab0501f37541903771d55 |
The dataset is available at: https://www.mpi-inf.mpg.de/departments/computer-vision-and-machine-learning/research/vision-and-language/visual-turing-challenge/

```
@INPROCEEDINGS{malinowski2014nips,
author = {Malinowski, Mateusz and Fritz, Mario},
title = {A Multi-World Approach to Question Answering about Real-World Scenes based on Uncertain Input},
booktitle = {Advances in Neural Information Processing Systems 27},
editor = {Z. Ghahramani and M. Welling and C. Cortes and N.D. Lawrence and K.Q. Weinberger},
pages = {1682--1690},
year = {2014},
publisher = {Curran Associates, Inc.},
url = {http://papers.nips.cc/paper/5411-a-multi-world-approach-to-question-answering-about-real-world-scenes-based-on-uncertain-input.pdf}
}
``` | Andyrasika/VQA-Dataset | [
"size_categories:100K<n<1M",
"language:en",
"license:mit",
"VQA",
"region:us"
] | 2024-01-17T15:37:46+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["100K<n<1M"], "pretty_name": "VQA ", "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "image_id", "dtype": "string"}, {"name": "label", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 700662, "num_examples": 9974}, {"name": "test", "num_bytes": 174412, "num_examples": 2494}], "download_size": 299109, "dataset_size": 875074}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "tags": ["VQA"]} | 2024-01-17T15:45:12+00:00 |
05f0fadfc3fd1538291946c6946dcdc0f52ab1fe | MatsuoDochiai/Sumpomni | [
"license:openrail",
"region:us"
] | 2024-01-17T15:45:53+00:00 | {"license": "openrail"} | 2024-01-17T15:46:45+00:00 |
|
aaa28ba1e5005c53a44745fb9ade41713ef6d757 | senhorsapo/spider | [
"license:openrail",
"region:us"
] | 2024-01-17T16:01:05+00:00 | {"license": "openrail"} | 2024-01-17T16:01:15+00:00 |
|
4967422cc91b92359f45e002a999bcb4ff865988 |
# Dataset of sothis (Fire Emblem)
This is the dataset of sothis (Fire Emblem), containing 433 images and their tags.
The core tags of this character are `green_hair, long_hair, braid, green_eyes, twin_braids, ribbon_braid, pointy_ears, ribbon, hair_ornament, hair_ribbon, side_braid`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 433 | 548.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sothis_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 433 | 325.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sothis_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 939 | 654.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sothis_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 433 | 489.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sothis_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 939 | 900.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sothis_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sothis_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, closed_mouth, simple_background, solo, tiara, upper_body, smile, white_background, looking_at_viewer |
| 1 | 25 |  |  |  |  |  | 1girl, dress, solo, tiara, barefoot, closed_mouth, full_body, smile, anklet, very_long_hair, simple_background, looking_at_viewer |
| 2 | 7 |  |  |  |  |  | 1girl, closed_mouth, sitting, solo, tiara, dress, smile, very_long_hair, throne |
| 3 | 5 |  |  |  |  |  | 1girl, christmas_ornaments, fur_trim, simple_background, smile, solo, tiara, closed_mouth, dress, full_body, white_background, very_long_hair |
| 4 | 9 |  |  |  |  |  | 1girl, fur_trim, gift_box, tiara, christmas_ornaments, smile, solo, dress, closed_mouth, holding, open_mouth |
| 5 | 8 |  |  |  |  |  | 1girl, bangs, cleavage, cosplay, official_alternate_costume, solo, tiara, medium_hair, clothing_cutout, hair_between_eyes, large_breasts, looking_at_viewer, blue_dress, blush, bare_shoulders, closed_mouth, upper_body |
| 6 | 9 |  |  |  |  |  | 2girls, tiara, dress, simple_background, white_background, open_mouth, smile, closed_mouth |
| 7 | 12 |  |  |  |  |  | halloween_costume, witch_hat, smile, 1girl, holding, striped, black_dress, black_headwear, lollipop, looking_at_viewer, official_alternate_costume, open_mouth, puffy_short_sleeves, broom, 1boy, solo |
| 8 | 18 |  |  |  |  |  | 1girl, hetero, nipples, penis, solo_focus, pussy, sex, 1boy, vaginal, small_breasts, tiara, uncensored, completely_nude, cum, navel, spread_legs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | closed_mouth | simple_background | solo | tiara | upper_body | smile | white_background | looking_at_viewer | dress | barefoot | full_body | anklet | very_long_hair | sitting | throne | christmas_ornaments | fur_trim | gift_box | holding | open_mouth | bangs | cleavage | cosplay | official_alternate_costume | medium_hair | clothing_cutout | hair_between_eyes | large_breasts | blue_dress | blush | bare_shoulders | 2girls | halloween_costume | witch_hat | striped | black_dress | black_headwear | lollipop | puffy_short_sleeves | broom | 1boy | hetero | nipples | penis | solo_focus | pussy | sex | vaginal | small_breasts | uncensored | completely_nude | cum | navel | spread_legs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------------|:-------|:--------|:-------------|:--------|:-------------------|:--------------------|:--------|:-----------|:------------|:---------|:-----------------|:----------|:---------|:----------------------|:-----------|:-----------|:----------|:-------------|:--------|:-----------|:----------|:-----------------------------|:--------------|:------------------|:--------------------|:----------------|:-------------|:--------|:-----------------|:---------|:--------------------|:------------|:----------|:--------------|:-----------------|:-----------|:----------------------|:--------|:-------|:---------|:----------|:--------|:-------------|:--------|:------|:----------|:----------------|:-------------|:------------------|:------|:--------|:--------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 25 |  |  |  |  |  | X | X | X | X | X | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | | X | X | | X | | | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | X | X | | X | X | | X | | X | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | X | | X | X | | X | | | X | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | X | | X | X | X | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | | X | X | | X | | X | X | | X | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 12 |  |  |  |  |  | X | | | X | | | X | | X | | | | | | | | | | | X | X | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 8 | 18 |  |  |  |  |  | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/sothis_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T16:03:46+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T17:37:09+00:00 |
bbaae6932323ff00d0245588fc080cccc0e9f6bd |
# Dataset of liz (Fire Emblem)
This is the dataset of liz (Fire Emblem), containing 321 images and their tags.
The core tags of this character are `blonde_hair, twintails, blue_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 321 | 354.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/liz_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 321 | 225.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/liz_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 668 | 448.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/liz_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 321 | 323.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/liz_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 668 | 604.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/liz_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/liz_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, dress, smile, solo, corset, looking_at_viewer, open_mouth, simple_background, white_background, apron, long_hair, long_sleeves, puffy_sleeves |
| 1 | 10 |  |  |  |  |  | 1girl, dress, solo, staff, smile, apron, boots, holding, full_body, corset, white_background |
| 2 | 7 |  |  |  |  |  | 1girl, open_mouth, smile, solo, dress, hair_flower, holding, upper_body, looking_at_viewer, simple_background, white_background, basket, official_alternate_costume |
| 3 | 9 |  |  |  |  |  | 1girl, christmas, dress, santa_hat, smile, solo, santa_costume, bell, looking_at_viewer, open_mouth, holding |
| 4 | 5 |  |  |  |  |  | 1girl, blush, completely_nude, looking_at_viewer, navel, nipples, arms_behind_back, collarbone, parted_bangs, pussy, smile, solo, closed_mouth, medium_breasts, simple_background, white_background, barefoot, censored, full_body, grey_eyes, groin, long_hair, small_breasts, standing |
| 5 | 29 |  |  |  |  |  | blush, hetero, 1boy, 1girl, nipples, sex, penis, solo_focus, medium_breasts, vaginal, open_mouth, spread_legs, sweat, looking_at_viewer, completely_nude, cum_in_pussy, mosaic_censoring, navel |
| 6 | 6 |  |  |  |  |  | 1girl, ahoge, circlet, grey_eyes, medium_breasts, bodystocking, cape, long_hair, turtleneck, 1boy, bangs, bridal_gauntlets, covered_navel, holding, panties, smile, thighhighs |
| 7 | 5 |  |  |  |  |  | 1girl, arms_behind_back, blush, looking_at_viewer, medium_breasts, shibari, solo, underwear_only, crying_with_eyes_open, gagged, green_eyes, open_mouth, white_panties, crotch_rope, frills, full_body, kneeling, navel, short_hair, white_bra, white_thighhighs, yellow_panties |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | dress | smile | solo | corset | looking_at_viewer | open_mouth | simple_background | white_background | apron | long_hair | long_sleeves | puffy_sleeves | staff | boots | holding | full_body | hair_flower | upper_body | basket | official_alternate_costume | christmas | santa_hat | santa_costume | bell | blush | completely_nude | navel | nipples | arms_behind_back | collarbone | parted_bangs | pussy | closed_mouth | medium_breasts | barefoot | censored | grey_eyes | groin | small_breasts | standing | hetero | 1boy | sex | penis | solo_focus | vaginal | spread_legs | sweat | cum_in_pussy | mosaic_censoring | ahoge | circlet | bodystocking | cape | turtleneck | bangs | bridal_gauntlets | covered_navel | panties | thighhighs | shibari | underwear_only | crying_with_eyes_open | gagged | green_eyes | white_panties | crotch_rope | frills | kneeling | short_hair | white_bra | white_thighhighs | yellow_panties |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:-------|:---------|:--------------------|:-------------|:--------------------|:-------------------|:--------|:------------|:---------------|:----------------|:--------|:--------|:----------|:------------|:--------------|:-------------|:---------|:-----------------------------|:------------|:------------|:----------------|:-------|:--------|:------------------|:--------|:----------|:-------------------|:-------------|:---------------|:--------|:---------------|:-----------------|:-----------|:-----------|:------------|:--------|:----------------|:-----------|:---------|:-------|:------|:--------|:-------------|:----------|:--------------|:--------|:---------------|:-------------------|:--------|:----------|:---------------|:-------|:-------------|:--------|:-------------------|:----------------|:----------|:-------------|:----------|:-----------------|:------------------------|:---------|:-------------|:----------------|:--------------|:---------|:-----------|:-------------|:------------|:-------------------|:-----------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | | | | X | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | | X | X | X | X | | | | | | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | X | X | X | | X | X | | | | | | | | | X | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | X | X | | X | | X | X | | X | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 29 |  |  |  |  |  | X | | | | | X | X | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | X | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | X | | | X | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | | X | | X | X | | | | | | | | | | X | | | | | | | | | X | | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/liz_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T16:03:51+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T17:21:00+00:00 |
6e7904aef9d4ef7365237f888bbee5df800b3bc0 | eminecetin/turkishReviews-BotChat | [
"region:us"
] | 2024-01-17T16:05:29+00:00 | {"dataset_info": {"features": [{"name": "review", "dtype": "string"}, {"name": "review_length", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1252876.2642514652, "num_examples": 3378}, {"name": "validation", "num_bytes": 139455.7357485349, "num_examples": 376}], "download_size": 896651, "dataset_size": 1392332.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-17T16:05:31+00:00 |
|
23edfecef664d265e49aa44edd847b8604264a0d | # Dataset Card for "Vietnamese-Books-dedup"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tmnam20/Vietnamese-Books-dedup | [
"region:us"
] | 2024-01-17T16:19:00+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3546619845, "num_examples": 14485736}], "download_size": 1922215933, "dataset_size": 3546619845}} | 2024-01-17T16:23:27+00:00 |
eadfffc91e179626d0c85fc8d3b832b831a9bb25 | jxm/scifact__openai_ada2 | [
"region:us"
] | 2024-01-17T16:25:35+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "embeddings_A", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 66934073, "num_examples": 5183}], "download_size": 67028968, "dataset_size": 66934073}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-17T16:25:38+00:00 |
|
e6d7731cc88dca79104226a12593a2b0f17076ec | trl-internal-testing/dolly-chatml-sft | [
"region:us"
] | 2024-01-17T16:30:26+00:00 | {"dataset_info": {"features": [{"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 819873, "num_examples": 1000}], "download_size": 480573, "dataset_size": 819873}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-17T16:30:26+00:00 |
|
b553d92659f27ae61785b9aa5cc3ef50f6ac5f72 | MIRKOSSSSSS/mirkolecce | [
"license:apache-2.0",
"region:us"
] | 2024-01-17T16:34:22+00:00 | {"license": "apache-2.0"} | 2024-01-17T16:34:23+00:00 |
|
277164dc7b4b4676062adcecd3be90af8ea4a950 |
# Dataset of setsuna (Fire Emblem)
This is the dataset of setsuna (Fire Emblem), containing 71 images and their tags.
The core tags of this character are `hair_over_one_eye, short_hair, blue_hair, blue_eyes, hairband`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 71 | 51.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/setsuna_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 71 | 36.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/setsuna_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 110 | 58.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/setsuna_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 71 | 47.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/setsuna_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 110 | 77.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/setsuna_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/setsuna_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, arrow_(projectile), gloves, solo, quiver, simple_background, holding_bow_(weapon), white_background |
| 1 | 6 |  |  |  |  |  | 1girl, simple_background, solo, upper_body, white_background |
| 2 | 5 |  |  |  |  |  | 1girl, fingerless_gloves, solo, upper_body, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | arrow_(projectile) | gloves | solo | quiver | simple_background | holding_bow_(weapon) | white_background | upper_body | fingerless_gloves | looking_at_viewer |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------------|:---------|:-------|:---------|:--------------------|:-----------------------|:-------------------|:-------------|:--------------------|:--------------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | |
| 1 | 6 |  |  |  |  |  | X | | | X | | X | | X | X | | |
| 2 | 5 |  |  |  |  |  | X | | | X | | | | | X | X | X |
| CyberHarem/setsuna_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T16:34:37+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T16:46:20+00:00 |
ba1996e296c790d4b9ae7c3dff0eaae60d26f903 |
# Dataset of nino (Fire Emblem)
This is the dataset of nino (Fire Emblem), containing 342 images and their tags.
The core tags of this character are `green_hair, short_hair, hairband, blue_eyes, purple_hairband`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 342 | 373.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nino_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 342 | 225.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nino_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 672 | 429.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nino_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 342 | 335.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nino_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 672 | 589.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nino_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nino_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 38 |  |  |  |  |  | 1girl, hetero, 1boy, blush, sex, open_mouth, solo_focus, penis, vaginal, nipples, mosaic_censoring, nude, cum_in_pussy, medium_breasts, sweat |
| 1 | 12 |  |  |  |  |  | 1girl, nipples, solo, blush, looking_at_viewer, small_breasts, completely_nude, navel, open_mouth, simple_background |
| 2 | 11 |  |  |  |  |  | 1girl, cape, solo, looking_at_viewer, simple_background, upper_body, open_mouth, white_background, long_sleeves, smile, blush |
| 3 | 10 |  |  |  |  |  | 1girl, cape, open_mouth, solo, skirt, belt, looking_at_viewer, :d, boots |
| 4 | 17 |  |  |  |  |  | 1girl, cape, solo, holding_book, long_sleeves, belt, open_mouth, smile, white_skirt, looking_at_viewer, simple_background, blush, boots |
| 5 | 5 |  |  |  |  |  | 1girl, sitting, skirt, smile, solo, blush, boots, cape, looking_at_viewer, long_sleeves, pouch |
| 6 | 6 |  |  |  |  |  | 1girl, bangs, black_dress, hood_down, long_sleeves, solo, belt_pouch, feather_trim, open_mouth, shiny_hair, short_dress, simple_background, blush, looking_at_viewer, :d, boots, full_body, hooded_cape, leg_up, white_background |
| 7 | 6 |  |  |  |  |  | 1boy, 1girl, cape, couple, hetero, hug, red_hair, smile, blush, bandages, closed_eyes, gloves, white_background |
| 8 | 13 |  |  |  |  |  | 1girl, fur_trim, long_sleeves, smile, open_mouth, santa_costume, santa_hat, solo, belt, red_dress, boots, gift_box, holding, looking_at_viewer, pouch, red_headwear, sack, white_background, blush, brown_gloves, christmas_ornaments, christmas_tree, pom_pom_(clothes), red_footwear, bangs, bell, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | hetero | 1boy | blush | sex | open_mouth | solo_focus | penis | vaginal | nipples | mosaic_censoring | nude | cum_in_pussy | medium_breasts | sweat | solo | looking_at_viewer | small_breasts | completely_nude | navel | simple_background | cape | upper_body | white_background | long_sleeves | smile | skirt | belt | :d | boots | holding_book | white_skirt | sitting | pouch | bangs | black_dress | hood_down | belt_pouch | feather_trim | shiny_hair | short_dress | full_body | hooded_cape | leg_up | couple | hug | red_hair | bandages | closed_eyes | gloves | fur_trim | santa_costume | santa_hat | red_dress | gift_box | holding | red_headwear | sack | brown_gloves | christmas_ornaments | christmas_tree | pom_pom_(clothes) | red_footwear | bell |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-------|:--------|:------|:-------------|:-------------|:--------|:----------|:----------|:-------------------|:-------|:---------------|:-----------------|:--------|:-------|:--------------------|:----------------|:------------------|:--------|:--------------------|:-------|:-------------|:-------------------|:---------------|:--------|:--------|:-------|:-----|:--------|:---------------|:--------------|:----------|:--------|:--------|:--------------|:------------|:-------------|:---------------|:-------------|:--------------|:------------|:--------------|:---------|:---------|:------|:-----------|:-----------|:--------------|:---------|:-----------|:----------------|:------------|:------------|:-----------|:----------|:---------------|:-------|:---------------|:----------------------|:-----------------|:--------------------|:---------------|:-------|
| 0 | 38 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | | | X | | X | | | | X | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | | | X | | X | | | | | | | | | | X | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | | | | | X | | | | | | | | | | X | X | | | | | X | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 17 |  |  |  |  |  | X | | | X | | X | | | | | | | | | | X | X | | | | X | X | | | X | X | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | X | | | | | | | | | | | | X | X | | | | | X | | | X | X | X | | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | | X | | X | | | | | | | | | | X | X | | | | X | | | X | X | | | | X | X | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | X | X | X | | | | | | | | | | | | | | | | | | X | | X | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 8 | 13 |  |  |  |  |  | X | | | X | | X | | | | | | | | | | X | X | | | | X | | | X | X | X | | X | | X | | | | X | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
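As a minimal sketch of how one of these clusters could be mined from the local copy, the snippet below filters the extracted images by a few tags of cluster 3 above. The tag names come from the table; the `dataset_dir` path and the all-tags-must-match rule are illustrative assumptions, and `item.meta['tags']` is only assumed to support membership tests (it is the same field printed in the loading example earlier).

```python
from waifuc.source import LocalSource

# a few tags of cluster 3 from the table above (illustrative subset)
wanted_tags = {'1girl', 'cape', 'skirt', 'boots'}

# directory the raw archive was extracted to earlier
source = LocalSource('dataset_dir')
for item in source:
    # keep only images whose tag metadata contains every wanted tag
    if all(tag in item.meta['tags'] for tag in wanted_tags):
        print(item.meta['filename'])
```

Requiring every tag keeps the filter strict; relaxing it to a majority of the cluster's tags would trade precision for recall.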
| CyberHarem/nino_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T16:35:16+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T17:40:15+00:00 |
4dbd10f56c4e3e318ae504ef9973f7d24345939d | # Dataset Card for "sample_105000_rows"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | kraitans21/sample_105000_rows | [
"region:us"
] | 2024-01-17T16:43:35+00:00 | {"dataset_info": {"features": [{"name": "source_id", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "meta", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "updated_date", "dtype": "string"}, {"name": "created_date", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 560214821.7, "num_examples": 100000}, {"name": "eval", "num_bytes": 28010741.085, "num_examples": 5000}], "download_size": 241109440, "dataset_size": 588225562.7850001}} | 2024-01-17T16:45:44+00:00 |
e5a91b9bb53334a22ed8d0b10ab5a77944e7fbcf | mehranandi/Forfirstai | [
"license:apache-2.0",
"region:us"
] | 2024-01-17T16:46:08+00:00 | {"license": "apache-2.0"} | 2024-01-17T16:46:09+00:00 |
|
abf40902d04da11b5a57a683aad26ffe06d5a770 | TaMduluza/fire_detection | [
"license:mit",
"region:us"
] | 2024-01-17T16:48:27+00:00 | {"license": "mit"} | 2024-01-17T17:08:05+00:00 |
|
edb870c4babc8a5819d5859aca9f52c40669b67a |
# Dataset Card for Evaluation run of kaitchup/Maixtchup-4x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kaitchup/Maixtchup-4x7b](https://huggingface.co/kaitchup/Maixtchup-4x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kaitchup__Maixtchup-4x7b",
"harness_winogrande_5",
split="train")
```
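A similar call should work for the aggregated metrics. The sketch below assumes the extra configuration is literally named "results" (as stated above) and that its "train" split tracks the latest run, mirroring the per-task configurations:

```python
from datasets import load_dataset

# aggregated run-level metrics; "results" is the additional
# configuration mentioned above, and "train" is assumed to
# point to the latest run
results = load_dataset(
    "open-llm-leaderboard/details_kaitchup__Maixtchup-4x7b",
    "results",
    split="train",
)
```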
## Latest results
These are the [latest results from run 2024-01-17T16:47:01.392242](https://huggingface.co/datasets/open-llm-leaderboard/details_kaitchup__Maixtchup-4x7b/blob/main/results_2024-01-17T16-47-01.392242.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its own configuration, with the "latest" split pointing to the most recent eval):
```python
{
"all": {
"acc": 0.6144719599933052,
"acc_stderr": 0.03303924482918558,
"acc_norm": 0.6168692677516201,
"acc_norm_stderr": 0.03370135211774917,
"mc1": 0.4039167686658507,
"mc1_stderr": 0.017177276822584284,
"mc2": 0.5612826178367374,
"mc2_stderr": 0.015986434965174608
},
"harness|arc:challenge|25": {
"acc": 0.590443686006826,
"acc_stderr": 0.014370358632472439,
"acc_norm": 0.6254266211604096,
"acc_norm_stderr": 0.014144193471893454
},
"harness|hellaswag|10": {
"acc": 0.6525592511451902,
"acc_stderr": 0.004751840646730854,
"acc_norm": 0.8382792272455686,
"acc_norm_stderr": 0.003674419799353668
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.028985455652334395,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.028985455652334395
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246483,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246483
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6064516129032258,
"acc_stderr": 0.027791878753132274,
"acc_norm": 0.6064516129032258,
"acc_norm_stderr": 0.027791878753132274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.02614848346915332,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.02614848346915332
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5871794871794872,
"acc_stderr": 0.024962683564331796,
"acc_norm": 0.5871794871794872,
"acc_norm_stderr": 0.024962683564331796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.03149930577784906,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.03149930577784906
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8,
"acc_stderr": 0.01714985851425095,
"acc_norm": 0.8,
"acc_norm_stderr": 0.01714985851425095
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.034063153607115065,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.034063153607115065
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145635,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145635
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676166,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676166
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082393,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082393
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381396,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381396
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3776536312849162,
"acc_stderr": 0.01621414875213663,
"acc_norm": 0.3776536312849162,
"acc_norm_stderr": 0.01621414875213663
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.025702640260603742,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.025702640260603742
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873862,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873862
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.012687818419599924,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.012687818419599924
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.02895975519682487,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.02895975519682487
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000325,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000325
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5572139303482587,
"acc_stderr": 0.03512310964123937,
"acc_norm": 0.5572139303482587,
"acc_norm_stderr": 0.03512310964123937
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4039167686658507,
"mc1_stderr": 0.017177276822584284,
"mc2": 0.5612826178367374,
"mc2_stderr": 0.015986434965174608
},
"harness|winogrande|5": {
"acc": 0.7600631412786109,
"acc_stderr": 0.01200207862948574
},
"harness|gsm8k|5": {
"acc": 0.5481425322213799,
"acc_stderr": 0.013708494995677651
}
}
```
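For quick inspection without the `datasets` library, the same numbers can be pulled straight from the results file linked above. This sketch reuses `hf_hub_download` and assumes only the repo and file names visible in this card, plus the key layout of the JSON excerpt above:

```python
import json
from huggingface_hub import hf_hub_download

# fetch the per-run results file from the "Latest results" section
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_kaitchup__Maixtchup-4x7b",
    repo_type="dataset",
    filename="results_2024-01-17T16-47-01.392242.json",
)
with open(path) as f:
    results = json.load(f)

# headline aggregates sit under the "all" key in the excerpt above
print(results["all"]["acc"], results["all"]["acc_norm"])
```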
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_kaitchup__Maixtchup-4x7b | [
"region:us"
] | 2024-01-17T16:49:16+00:00 | {"pretty_name": "Evaluation run of kaitchup/Maixtchup-4x7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [kaitchup/Maixtchup-4x7b](https://huggingface.co/kaitchup/Maixtchup-4x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kaitchup__Maixtchup-4x7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-17T16:47:01.392242](https://huggingface.co/datasets/open-llm-leaderboard/details_kaitchup__Maixtchup-4x7b/blob/main/results_2024-01-17T16-47-01.392242.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6144719599933052,\n \"acc_stderr\": 0.03303924482918558,\n \"acc_norm\": 0.6168692677516201,\n \"acc_norm_stderr\": 0.03370135211774917,\n \"mc1\": 0.4039167686658507,\n \"mc1_stderr\": 0.017177276822584284,\n \"mc2\": 0.5612826178367374,\n \"mc2_stderr\": 0.015986434965174608\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.590443686006826,\n \"acc_stderr\": 0.014370358632472439,\n \"acc_norm\": 0.6254266211604096,\n \"acc_norm_stderr\": 0.014144193471893454\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6525592511451902,\n \"acc_stderr\": 0.004751840646730854,\n \"acc_norm\": 0.8382792272455686,\n \"acc_norm_stderr\": 0.003674419799353668\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.028985455652334395,\n \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.028985455652334395\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n 
\"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246483,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246483\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6064516129032258,\n \"acc_stderr\": 0.027791878753132274,\n \"acc_norm\": 0.6064516129032258,\n \"acc_norm_stderr\": 0.027791878753132274\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.02614848346915332,\n \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.02614848346915332\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5871794871794872,\n 
\"acc_stderr\": 0.024962683564331796,\n \"acc_norm\": 0.5871794871794872,\n \"acc_norm_stderr\": 0.024962683564331796\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.03149930577784906,\n \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.03149930577784906\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.01714985851425095,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.01714985851425095\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.47685185185185186,\n \"acc_stderr\": 0.034063153607115065,\n \"acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.034063153607115065\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145635,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145635\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676166,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676166\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.6412556053811659,\n \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082393,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082393\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n \"acc_stderr\": 0.014036945850381396,\n \"acc_norm\": 0.80970625798212,\n \"acc_norm_stderr\": 0.014036945850381396\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3776536312849162,\n \"acc_stderr\": 0.01621414875213663,\n \"acc_norm\": 0.3776536312849162,\n \"acc_norm_stderr\": 0.01621414875213663\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603742,\n \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603742\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873862,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873862\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n \"acc_stderr\": 0.012687818419599924,\n \"acc_norm\": 0.44328552803129073,\n \"acc_norm_stderr\": 0.012687818419599924\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.02895975519682487,\n \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.02895975519682487\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000325,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000325\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5572139303482587,\n \"acc_stderr\": 0.03512310964123937,\n \"acc_norm\": 0.5572139303482587,\n \"acc_norm_stderr\": 0.03512310964123937\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4039167686658507,\n \"mc1_stderr\": 0.017177276822584284,\n \"mc2\": 0.5612826178367374,\n \"mc2_stderr\": 0.015986434965174608\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7600631412786109,\n \"acc_stderr\": 0.01200207862948574\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5481425322213799,\n \"acc_stderr\": 0.013708494995677651\n }\n}\n```", "repo_url": "https://huggingface.co/kaitchup/Maixtchup-4x7b", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|arc:challenge|25_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|gsm8k|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hellaswag|10_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T16-47-01.392242.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T16-47-01.392242.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-17T16-47-01.392242.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-17T16-47-01.392242.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T16-47-01.392242.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T16-47-01.392242.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["**/details_harness|winogrande|5_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-17T16-47-01.392242.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_17T16_47_01.392242", "path": ["results_2024-01-17T16-47-01.392242.parquet"]}, {"split": "latest", "path": 
["results_2024-01-17T16-47-01.392242.parquet"]}]}]} | 2024-01-17T16:49:39+00:00 |
a24f6e351a03d4c2cf01c83e752d1d5c8eb0676a |
# Dataset of ophelia (Fire Emblem)
This is the dataset of ophelia (Fire Emblem), containing 500 images and their tags.
The core tags of this character are `blonde_hair, long_hair, breasts, ahoge, grey_eyes, bangs, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 623.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ophelia_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 344.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ophelia_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1160 | 730.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ophelia_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 543.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ophelia_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1160 | 1.02 GiB | [Download](https://huggingface.co/datasets/CyberHarem/ophelia_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ophelia_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined here.
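As a sketch of how a cluster can be mined (a minimal example, not part of the official tooling): the snippet below filters the extracted raw dataset down to images carrying a hypothetical cluster signature, assuming `dataset_dir` was produced by the snippet above and that iterating `item.meta['tags']` yields tag names.
```python
from waifuc.source import LocalSource

# hypothetical signature taken from cluster #2 in the table below
wanted_tags = {'circlet', 'official_alternate_costume', 'turtleneck'}

for item in LocalSource('dataset_dir'):  # directory extracted above
    # item.meta['tags'] may be a list or a tag->score mapping;
    # either way, iterating it yields the tag names
    tags = set(item.meta['tags'])
    if wanted_tags <= tags:
        print(item.meta['filename'], 'matches the cluster signature')
```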
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, cape, circlet, solo, closed_mouth, smile, upper_body, looking_at_viewer, cleavage, bodystocking, covered_navel, blue_eyes, turtleneck, center_opening |
| 1 | 15 |  |  |  |  |  | 1girl, cape, circlet, looking_at_viewer, solo, turtleneck, bodystocking, upper_body, covered_navel, open_mouth, one_eye_closed, asymmetrical_bangs, bridal_gauntlets, cleavage, smile, blue_eyes |
| 2 | 8 |  |  |  |  |  | 1girl, circlet, looking_at_viewer, smile, solo, asymmetrical_bangs, official_alternate_costume, upper_body, closed_mouth, turtleneck, bodystocking, cleavage |
| 3 | 9 |  |  |  |  |  | 1girl, completely_nude, looking_at_viewer, nipples, solo, smile, closed_mouth, navel, pussy, blush, large_breasts, barefoot, blue_eyes, circlet |
| 4 | 14 |  |  |  |  |  | 1boy, 1girl, hetero, nipples, open_mouth, penis, sex, blush, vaginal, circlet, completely_nude, cum_in_pussy, uncensored, large_breasts, navel, spread_legs |
| 5 | 10 |  |  |  |  |  | 1boy, 1girl, circlet, hetero, penis, solo_focus, large_breasts, looking_at_viewer, nipples, blush, paizuri, censored, pov, cum_on_breasts, smile, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cape | circlet | solo | closed_mouth | smile | upper_body | looking_at_viewer | cleavage | bodystocking | covered_navel | blue_eyes | turtleneck | center_opening | open_mouth | one_eye_closed | asymmetrical_bangs | bridal_gauntlets | official_alternate_costume | completely_nude | nipples | navel | pussy | blush | large_breasts | barefoot | 1boy | hetero | penis | sex | vaginal | cum_in_pussy | uncensored | spread_legs | solo_focus | paizuri | censored | pov | cum_on_breasts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------|:-------|:---------------|:--------|:-------------|:--------------------|:-----------|:---------------|:----------------|:------------|:-------------|:-----------------|:-------------|:-----------------|:---------------------|:-------------------|:-----------------------------|:------------------|:----------|:--------|:--------|:--------|:----------------|:-----------|:-------|:---------|:--------|:------|:----------|:---------------|:-------------|:--------------|:-------------|:----------|:-----------|:------|:-----------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 15 |  |  |  |  |  | X | X | X | X | | X | X | X | X | X | X | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | | X | X | X | X | X | X | X | X | | | X | | | | X | | X | | | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | | X | X | X | X | | X | | | | X | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 4 | 14 |  |  |  |  |  | X | | X | | | | | | | | | | | | X | | | | | X | X | X | | X | X | | X | X | X | X | X | X | X | X | | | | | |
| 5 | 10 |  |  |  |  |  | X | | X | | | X | X | X | | | | | | | | | | | | | X | | | X | X | | X | X | X | | | | | | X | X | X | X | X |
| CyberHarem/ophelia_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T16:51:13+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T18:41:57+00:00 |
c845e054745df6b928c4a6d2e263f622f509cd19 |
# Dataset of charlotte (Fire Emblem)
This is the dataset of charlotte (Fire Emblem), containing 285 images and their tags.
The core tags of this character are `blonde_hair, long_hair, breasts, bow, blue_eyes, hair_bow, large_breasts, bangs, white_bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 285 | 304.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charlotte_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 285 | 183.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charlotte_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 665 | 377.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charlotte_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 285 | 275.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charlotte_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 665 | 516.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/charlotte_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/charlotte_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined here.
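To see which tags dominate before trusting any cluster, a quick frequency count can help; a minimal sketch, assuming `dataset_dir` holds the extracted raw archive from the snippet above:
```python
from collections import Counter

from waifuc.source import LocalSource

tag_counts = Counter()
for item in LocalSource('dataset_dir'):
    # iterating item.meta['tags'] yields tag names (list or dict keys alike)
    tag_counts.update(set(item.meta['tags']))

# the most frequent tags should roughly mirror the cluster signatures below
for tag, count in tag_counts.most_common(20):
    print(f'{tag}: {count}')
```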
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 26 |  |  |  |  |  | 1girl, hetero, solo_focus, 1boy, penis, pussy, nipples, blush, uncensored, vaginal, nude, open_mouth, navel, sex_from_behind, spread_legs, testicles |
| 1 | 18 |  |  |  |  |  | 1boy, 1girl, hetero, solo_focus, penis, nipples, paizuri, blush, nude, cum_on_breasts, smile, facial, uncensored |
| 2 | 20 |  |  |  |  |  | 1girl, cleavage, solo, smile, navel, midriff, looking_at_viewer, shoulder_armor, bikini_armor, white_background, simple_background, spikes |
| 3 | 11 |  |  |  |  |  | 1girl, solo, huge_breasts, looking_at_viewer, smile, thick_thighs, blush, blunt_bangs, short_shorts, ass, cleavage, simple_background, thighhighs, blue_background, denim_shorts, from_behind, looking_back |
| 4 | 10 |  |  |  |  |  | 1girl, huge_penis, solo, uncensored, blush, erection, nipples, nude, open_mouth, testicles, navel, heart, animal_penis, blunt_bangs, ejaculation, futanari_masturbation, rolling_eyes, spread_legs, thighhighs, tongue, veiny_penis |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | hetero | solo_focus | 1boy | penis | pussy | nipples | blush | uncensored | vaginal | nude | open_mouth | navel | sex_from_behind | spread_legs | testicles | paizuri | cum_on_breasts | smile | facial | cleavage | solo | midriff | looking_at_viewer | shoulder_armor | bikini_armor | white_background | simple_background | spikes | huge_breasts | thick_thighs | blunt_bangs | short_shorts | ass | thighhighs | blue_background | denim_shorts | from_behind | looking_back | huge_penis | erection | heart | animal_penis | ejaculation | futanari_masturbation | rolling_eyes | tongue | veiny_penis |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-------------|:-------|:--------|:--------|:----------|:--------|:-------------|:----------|:-------|:-------------|:--------|:------------------|:--------------|:------------|:----------|:-----------------|:--------|:---------|:-----------|:-------|:----------|:--------------------|:-----------------|:---------------|:-------------------|:--------------------|:---------|:---------------|:---------------|:--------------|:---------------|:------|:-------------|:------------------|:---------------|:--------------|:---------------|:-------------|:-----------|:--------|:---------------|:--------------|:------------------------|:---------------|:---------|:--------------|
| 0 | 26 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 18 |  |  |  |  |  | X | X | X | X | X | | X | X | X | | X | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 20 |  |  |  |  |  | X | | | | | | | | | | | | X | | | | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | | | | | | | X | | | | | | | | | | | X | | X | X | | X | | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | | | | | | X | X | X | | X | X | X | | X | X | | | | | | X | | | | | | | | | | X | | | X | | | | | X | X | X | X | X | X | X | X | X |
| CyberHarem/charlotte_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T16:51:13+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T17:45:07+00:00 |
dd909a5d2ec2c133b7cce62ef0c48d37693b6de0 |
# Dataset of serge (Fire Emblem)
This is the dataset of serge (Fire Emblem), containing 151 images and their tags.
The core tags of this character are `long_hair, breasts, red_hair, red_eyes, hairband, large_breasts, pink_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 151 | 164.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/serge_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 151 | 100.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/serge_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 325 | 192.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/serge_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 151 | 147.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/serge_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 325 | 259.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/serge_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/serge_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined here.
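One way to turn a cluster into a training subset is to copy the matching images out; a minimal sketch, assuming `dataset_dir` comes from the snippet above and that `item.image` is a PIL image (as the print statement above suggests):
```python
import os

from waifuc.source import LocalSource

# hypothetical signature drawn from cluster #0 (the ninja outfit) below
ninja_tags = {'ninja', 'fingerless_gloves', 'white_scarf'}

os.makedirs('serge_ninja', exist_ok=True)
for item in LocalSource('dataset_dir'):
    if ninja_tags <= set(item.meta['tags']):
        # assumes item.image is a PIL image and filename keeps its extension
        item.image.save(os.path.join('serge_ninja', item.meta['filename']))
```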
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, ninja, solo, bangs, fingerless_gloves, looking_at_viewer, simple_background, cleavage, medium_breasts, obi, pink_eyes, smile, white_background, white_scarf, black_gloves, holding, arm_guards, elbow_gloves, black_thighhighs, closed_mouth, official_alternate_costume, sheathed, short_sword, sleeveless_kimono, very_long_hair |
| 1 | 17 |  |  |  |  |  | 1girl, smile, armor, solo, looking_at_viewer, simple_background, gloves, closed_mouth, upper_body, white_background |
| 2 | 12 |  |  |  |  |  | 1girl, from_behind, solo, looking_back, back_cutout, gloves, looking_at_viewer, smile, axe, holding_weapon, backless_dress, boots, shoulder_armor, simple_background, thighhighs |
| 3 | 6 |  |  |  |  |  | 1boy, 1girl, hetero, navel, nipples, blush, penis, sex, solo_focus, uncensored, clitoris, smile, sweat, vaginal, artist_name, lying, pussy_juice, spread_legs |
| 4 | 8 |  |  |  |  |  | 1boy, 1girl, hetero, penis, solo_focus, uncensored, blush, looking_at_viewer, nipples, ass, sweat, completely_nude, girl_on_top, looking_back, pussy, smile, anus, cum, pink_eyes, reverse_cowgirl_position, sex_from_behind, vaginal, artist_name, bangs, open_mouth, pov, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | ninja | solo | bangs | fingerless_gloves | looking_at_viewer | simple_background | cleavage | medium_breasts | obi | pink_eyes | smile | white_background | white_scarf | black_gloves | holding | arm_guards | elbow_gloves | black_thighhighs | closed_mouth | official_alternate_costume | sheathed | short_sword | sleeveless_kimono | very_long_hair | armor | gloves | upper_body | from_behind | looking_back | back_cutout | axe | holding_weapon | backless_dress | boots | shoulder_armor | thighhighs | 1boy | hetero | navel | nipples | blush | penis | sex | solo_focus | uncensored | clitoris | sweat | vaginal | artist_name | lying | pussy_juice | spread_legs | ass | completely_nude | girl_on_top | pussy | anus | cum | reverse_cowgirl_position | sex_from_behind | open_mouth | pov |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------|:--------------------|:--------------------|:--------------------|:-----------|:-----------------|:------|:------------|:--------|:-------------------|:--------------|:---------------|:----------|:-------------|:---------------|:-------------------|:---------------|:-----------------------------|:-----------|:--------------|:--------------------|:-----------------|:--------|:---------|:-------------|:--------------|:---------------|:--------------|:------|:-----------------|:-----------------|:--------|:-----------------|:-------------|:-------|:---------|:--------|:----------|:--------|:--------|:------|:-------------|:-------------|:-----------|:--------|:----------|:--------------|:--------|:--------------|:--------------|:------|:------------------|:--------------|:--------|:-------|:------|:---------------------------|:------------------|:-------------|:------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 17 |  |  |  |  |  | X | | X | | | X | X | | | | | X | X | | | | | | | X | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 12 |  |  |  |  |  | X | | X | | | X | X | | | | | X | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | | | X | | X | X | | | | X | X | | | | | | | | | | | | | | | | | | X | | | | | | | | X | X | | X | X | X | | X | X | | X | X | X | | | | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/serge_fireemblem | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-17T16:51:16+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-17T17:24:17+00:00 |
f3714cfd498cdeed07e0dcd41f7d99069fda71b3 |
# Dataset Card for Evaluation run of llmixer/BigWeave-v20-110b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [llmixer/BigWeave-v20-110b](https://huggingface.co/llmixer/BigWeave-v20-110b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_llmixer__BigWeave-v20-110b",
"harness_winogrande_5",
	split="latest")
```
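With 63 configurations available, it can be handy to enumerate them before picking one; a minimal sketch using the `datasets` helper (the config name below is one of those listed in this repo's metadata):
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_llmixer__BigWeave-v20-110b"
configs = get_dataset_config_names(repo)
print(len(configs), "configs, e.g.", configs[:3])

# every config exposes a timestamped split plus a "latest" alias
data = load_dataset(repo, "harness_gsm8k_5", split="latest")
```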
## Latest results
These are the [latest results from run 2024-02-16T10:41:33.075058](https://huggingface.co/datasets/open-llm-leaderboard/details_llmixer__BigWeave-v20-110b/blob/main/results_2024-02-16T10-41-33.075058.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7006598594902614,
"acc_stderr": 0.030334879953044784,
"acc_norm": 0.7077385502169132,
"acc_norm_stderr": 0.03092785311503184,
"mc1": 0.44920440636474906,
"mc1_stderr": 0.017412941986115305,
"mc2": 0.6247452534043703,
"mc2_stderr": 0.01525624326187566
},
"harness|arc:challenge|25": {
"acc": 0.6484641638225256,
"acc_stderr": 0.013952413699600935,
"acc_norm": 0.681740614334471,
"acc_norm_stderr": 0.013611993916971453
},
"harness|hellaswag|10": {
"acc": 0.7175861382194781,
"acc_stderr": 0.004492535748097629,
"acc_norm": 0.885381398127863,
"acc_norm_stderr": 0.0031791005658879977
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670716,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670716
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8541666666666666,
"acc_stderr": 0.02951424596429177,
"acc_norm": 0.8541666666666666,
"acc_norm_stderr": 0.02951424596429177
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6893617021276596,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.6893617021276596,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.04013124195424386,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.04013124195424386
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.025680564640056882,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.025680564640056882
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8516129032258064,
"acc_stderr": 0.020222737554330385,
"acc_norm": 0.8516129032258064,
"acc_norm_stderr": 0.020222737554330385
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5566502463054187,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.5566502463054187,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8303030303030303,
"acc_stderr": 0.029311188674983137,
"acc_norm": 0.8303030303030303,
"acc_norm_stderr": 0.029311188674983137
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8838383838383839,
"acc_stderr": 0.022828881775249377,
"acc_norm": 0.8838383838383839,
"acc_norm_stderr": 0.022828881775249377
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.019321805557223157,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.019321805557223157
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7128205128205128,
"acc_stderr": 0.022939925418530616,
"acc_norm": 0.7128205128205128,
"acc_norm_stderr": 0.022939925418530616
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8025210084033614,
"acc_stderr": 0.025859164122051456,
"acc_norm": 0.8025210084033614,
"acc_norm_stderr": 0.025859164122051456
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5099337748344371,
"acc_stderr": 0.04081677107248437,
"acc_norm": 0.5099337748344371,
"acc_norm_stderr": 0.04081677107248437
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8844036697247707,
"acc_stderr": 0.01370874953417264,
"acc_norm": 0.8844036697247707,
"acc_norm_stderr": 0.01370874953417264
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.625,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.625,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9068627450980392,
"acc_stderr": 0.020397853969426998,
"acc_norm": 0.9068627450980392,
"acc_norm_stderr": 0.020397853969426998
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.019995560723758556,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.019995560723758556
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.027991534258519524,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.027991534258519524
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.03092278832044579,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.03092278832044579
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8646232439335888,
"acc_stderr": 0.012234384586856488,
"acc_norm": 0.8646232439335888,
"acc_norm_stderr": 0.012234384586856488
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8121387283236994,
"acc_stderr": 0.021029269752423217,
"acc_norm": 0.8121387283236994,
"acc_norm_stderr": 0.021029269752423217
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40558659217877097,
"acc_stderr": 0.016421670506339185,
"acc_norm": 0.40558659217877097,
"acc_norm_stderr": 0.016421670506339185
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.023805186524888156,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.023805186524888156
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7942122186495176,
"acc_stderr": 0.022961339906764244,
"acc_norm": 0.7942122186495176,
"acc_norm_stderr": 0.022961339906764244
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8364197530864198,
"acc_stderr": 0.020581466138257117,
"acc_norm": 0.8364197530864198,
"acc_norm_stderr": 0.020581466138257117
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.02965823509766691,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.02965823509766691
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5658409387222947,
"acc_stderr": 0.012659033237067253,
"acc_norm": 0.5658409387222947,
"acc_norm_stderr": 0.012659033237067253
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.75,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.75,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7728758169934641,
"acc_stderr": 0.016949853279212373,
"acc_norm": 0.7728758169934641,
"acc_norm_stderr": 0.016949853279212373
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8163265306122449,
"acc_stderr": 0.024789071332007636,
"acc_norm": 0.8163265306122449,
"acc_norm_stderr": 0.024789071332007636
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824667,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824667
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44920440636474906,
"mc1_stderr": 0.017412941986115305,
"mc2": 0.6247452534043703,
"mc2_stderr": 0.01525624326187566
},
"harness|winogrande|5": {
"acc": 0.8208366219415943,
"acc_stderr": 0.010777949156047987
},
"harness|gsm8k|5": {
"acc": 0.3639120545868082,
"acc_stderr": 0.013252539227966193
}
}
```
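To work with these aggregates programmatically, the results file linked above can be fetched directly; a minimal sketch, assuming only `huggingface_hub` and the filename shown in the link (the exact JSON nesting may differ slightly from the excerpt, hence the fallback):
```python
import json

from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_llmixer__BigWeave-v20-110b",
    repo_type="dataset",
    filename="results_2024-02-16T10-41-33.075058.json",
)
with open(path) as f:
    payload = json.load(f)

# metrics may sit at the top level (as excerpted above) or under "results"
metrics = payload.get("results", payload)
print(metrics["all"]["acc"], "+/-", metrics["all"]["acc_stderr"])
```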
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_llmixer__BigWeave-v20-110b | [
"region:us"
] | 2024-02-16T10:43:51+00:00 | {"pretty_name": "Evaluation run of llmixer/BigWeave-v20-110b", "dataset_summary": "Dataset automatically created during the evaluation run of model [llmixer/BigWeave-v20-110b](https://huggingface.co/llmixer/BigWeave-v20-110b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_llmixer__BigWeave-v20-110b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T10:41:33.075058](https://huggingface.co/datasets/open-llm-leaderboard/details_llmixer__BigWeave-v20-110b/blob/main/results_2024-02-16T10-41-33.075058.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7006598594902614,\n \"acc_stderr\": 0.030334879953044784,\n \"acc_norm\": 0.7077385502169132,\n \"acc_norm_stderr\": 0.03092785311503184,\n \"mc1\": 0.44920440636474906,\n \"mc1_stderr\": 0.017412941986115305,\n \"mc2\": 0.6247452534043703,\n \"mc2_stderr\": 0.01525624326187566\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6484641638225256,\n \"acc_stderr\": 0.013952413699600935,\n \"acc_norm\": 0.681740614334471,\n \"acc_norm_stderr\": 0.013611993916971453\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7175861382194781,\n \"acc_stderr\": 0.004492535748097629,\n \"acc_norm\": 0.885381398127863,\n \"acc_norm_stderr\": 0.0031791005658879977\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03317672787533157,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03317672787533157\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670716,\n \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670716\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8541666666666666,\n \"acc_stderr\": 0.02951424596429177,\n \"acc_norm\": 0.8541666666666666,\n \"acc_norm_stderr\": 0.02951424596429177\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 
0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6893617021276596,\n \"acc_stderr\": 0.03025123757921317,\n \"acc_norm\": 0.6893617021276596,\n \"acc_norm_stderr\": 0.03025123757921317\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.04013124195424386,\n \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.04013124195424386\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.025680564640056882,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.025680564640056882\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8516129032258064,\n \"acc_stderr\": 0.020222737554330385,\n \"acc_norm\": 0.8516129032258064,\n \"acc_norm_stderr\": 0.020222737554330385\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5566502463054187,\n \"acc_stderr\": 0.03495334582162933,\n \"acc_norm\": 0.5566502463054187,\n \"acc_norm_stderr\": 0.03495334582162933\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8303030303030303,\n \"acc_stderr\": 0.029311188674983137,\n \"acc_norm\": 0.8303030303030303,\n \"acc_norm_stderr\": 0.029311188674983137\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8838383838383839,\n \"acc_stderr\": 0.022828881775249377,\n \"acc_norm\": 0.8838383838383839,\n \"acc_norm_stderr\": 0.022828881775249377\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223157,\n \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223157\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7128205128205128,\n \"acc_stderr\": 0.022939925418530616,\n \"acc_norm\": 0.7128205128205128,\n \"acc_norm_stderr\": 0.022939925418530616\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8025210084033614,\n \"acc_stderr\": 0.025859164122051456,\n \"acc_norm\": 0.8025210084033614,\n \"acc_norm_stderr\": 0.025859164122051456\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248437,\n \"acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248437\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8844036697247707,\n \"acc_stderr\": 0.01370874953417264,\n \"acc_norm\": 0.8844036697247707,\n \"acc_norm_stderr\": 0.01370874953417264\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9068627450980392,\n \"acc_stderr\": 0.020397853969426998,\n \"acc_norm\": 0.9068627450980392,\n \"acc_norm_stderr\": 0.020397853969426998\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8945147679324894,\n \"acc_stderr\": 0.019995560723758556,\n \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.019995560723758556\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n \"acc_stderr\": 0.027991534258519524,\n \"acc_norm\": 0.7757847533632287,\n \"acc_norm_stderr\": 0.027991534258519524\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8677685950413223,\n \"acc_stderr\": 0.03092278832044579,\n \"acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.03092278832044579\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8646232439335888,\n 
\"acc_stderr\": 0.012234384586856488,\n \"acc_norm\": 0.8646232439335888,\n \"acc_norm_stderr\": 0.012234384586856488\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8121387283236994,\n \"acc_stderr\": 0.021029269752423217,\n \"acc_norm\": 0.8121387283236994,\n \"acc_norm_stderr\": 0.021029269752423217\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40558659217877097,\n \"acc_stderr\": 0.016421670506339185,\n \"acc_norm\": 0.40558659217877097,\n \"acc_norm_stderr\": 0.016421670506339185\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.023805186524888156,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.023805186524888156\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7942122186495176,\n \"acc_stderr\": 0.022961339906764244,\n \"acc_norm\": 0.7942122186495176,\n \"acc_norm_stderr\": 0.022961339906764244\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8364197530864198,\n \"acc_stderr\": 0.020581466138257117,\n \"acc_norm\": 0.8364197530864198,\n \"acc_norm_stderr\": 0.020581466138257117\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.02965823509766691,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.02965823509766691\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5658409387222947,\n \"acc_stderr\": 0.012659033237067253,\n \"acc_norm\": 0.5658409387222947,\n \"acc_norm_stderr\": 0.012659033237067253\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.026303648393696036,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.026303648393696036\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7728758169934641,\n \"acc_stderr\": 0.016949853279212373,\n \"acc_norm\": 0.7728758169934641,\n \"acc_norm_stderr\": 0.016949853279212373\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8163265306122449,\n \"acc_stderr\": 0.024789071332007636,\n \"acc_norm\": 0.8163265306122449,\n \"acc_norm_stderr\": 0.024789071332007636\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n \"acc_stderr\": 0.022076326101824667,\n \"acc_norm\": 0.8905472636815921,\n \"acc_norm_stderr\": 0.022076326101824667\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44920440636474906,\n \"mc1_stderr\": 0.017412941986115305,\n \"mc2\": 0.6247452534043703,\n \"mc2_stderr\": 0.01525624326187566\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8208366219415943,\n \"acc_stderr\": 0.010777949156047987\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3639120545868082,\n \"acc_stderr\": 0.013252539227966193\n }\n}\n```", "repo_url": 
"https://huggingface.co/llmixer/BigWeave-v20-110b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|arc:challenge|25_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|gsm8k|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hellaswag|10_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T10-41-33.075058.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T10-41-33.075058.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T10-41-33.075058.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T10-41-33.075058.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T10-41-33.075058.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T10-41-33.075058.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["**/details_harness|winogrande|5_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T10-41-33.075058.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T10_41_33.075058", "path": ["results_2024-02-16T10-41-33.075058.parquet"]}, {"split": "latest", "path": 
["results_2024-02-16T10-41-33.075058.parquet"]}]}]} | 2024-02-16T10:44:12+00:00 |
009247e1d352969c780155ccb816c1c53df206fb | suthawadee/receipt_th | [
"region:us"
] | 2024-02-16T10:46:20+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "ground_truth", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 14904747.0, "num_examples": 80}, {"name": "validation", "num_bytes": 1625914.0, "num_examples": 10}, {"name": "test", "num_bytes": 1713249.0, "num_examples": 10}], "download_size": 18146419, "dataset_size": 18243910.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-16T10:46:55+00:00 |
|
65292ca3a2e2b313e6fe8379c28bd8e32f42efa5 | Recag/Rp_CommonC_663_1 | [
"region:us"
] | 2024-02-16T10:48:20+00:00 | {} | 2024-02-16T12:09:20+00:00 |
|
d0b97b19bae74b7c910bce5be26bf232c45a47bf | Recag/Rp_CommonC_663_2 | [
"region:us"
] | 2024-02-16T10:48:25+00:00 | {} | 2024-02-16T12:06:22+00:00 |
|
8c18a650a5e6f8c69b20f239ff3e25c2504b8e68 | Recag/Rp_CommonC_664_1 | [
"region:us"
] | 2024-02-16T10:48:36+00:00 | {} | 2024-02-16T12:06:47+00:00 |
|
ddb3cd40c80958fa3e3b757c6c11056a60c5034e | Recag/Rp_CommonC_664_2 | [
"region:us"
] | 2024-02-16T10:48:48+00:00 | {} | 2024-02-16T12:09:17+00:00 |
|
e849ae969ab42bdc69b30f238ff17bb1cc88b5cc | Recag/Rp_CommonC_665_1 | [
"region:us"
] | 2024-02-16T10:49:10+00:00 | {} | 2024-02-16T12:09:21+00:00 |
|
788d8c4192b574a25155c3ac4482013f902f60cf | Recag/Rp_CommonC_665_2 | [
"region:us"
] | 2024-02-16T10:49:16+00:00 | {} | 2024-02-16T12:07:08+00:00 |
|
d26fa5f64870c21464cced5977d5a51135774a25 |
# Dataset Card for Evaluation run of logicker/SkkuDS-DPO-72B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [logicker/SkkuDS-DPO-72B-v1](https://huggingface.co/logicker/SkkuDS-DPO-72B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_logicker__SkkuDS-DPO-72B-v1",
"harness_winogrande_5",
	split="latest")
```
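For example, to pull only the aggregated metrics rather than the per-sample details, you can load the "results" configuration instead (a minimal sketch; the `results` config and its `latest` split are the ones declared in this repo's metadata):
```python
from datasets import load_dataset

# Aggregated metrics of the evaluation run; "latest" always resolves
# to the most recent timestamped split.
results = load_dataset(
    "open-llm-leaderboard/details_logicker__SkkuDS-DPO-72B-v1",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the aggregated scores
```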
## Latest results
These are the [latest results from run 2024-02-16T10:55:52.095277](https://huggingface.co/datasets/open-llm-leaderboard/details_logicker__SkkuDS-DPO-72B-v1/blob/main/results_2024-02-16T10-55-52.095277.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7681185312998495,
"acc_stderr": 0.02797672385731024,
"acc_norm": 0.7728008468755523,
"acc_norm_stderr": 0.02849748439769033,
"mc1": 0.41370869033047736,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.595432675425976,
"mc2_stderr": 0.014511387340720846
},
"harness|arc:challenge|25": {
"acc": 0.6271331058020477,
"acc_stderr": 0.014131176760131172,
"acc_norm": 0.659556313993174,
"acc_norm_stderr": 0.013847460518892978
},
"harness|hellaswag|10": {
"acc": 0.6671977693686517,
"acc_stderr": 0.004702533775930293,
"acc_norm": 0.8599880501892053,
"acc_norm_stderr": 0.0034629026011361893
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474928,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8226415094339623,
"acc_stderr": 0.023508739218846934,
"acc_norm": 0.8226415094339623,
"acc_norm_stderr": 0.023508739218846934
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.023112508176051236,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.023112508176051236
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818317,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818317
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.049512182523962604,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.049512182523962604
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.8042553191489362,
"acc_stderr": 0.025937853139977148,
"acc_norm": 0.8042553191489362,
"acc_norm_stderr": 0.025937853139977148
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7862068965517242,
"acc_stderr": 0.03416520447747549,
"acc_norm": 0.7862068965517242,
"acc_norm_stderr": 0.03416520447747549
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7116402116402116,
"acc_stderr": 0.02333065405453588,
"acc_norm": 0.7116402116402116,
"acc_norm_stderr": 0.02333065405453588
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5952380952380952,
"acc_stderr": 0.04390259265377563,
"acc_norm": 0.5952380952380952,
"acc_norm_stderr": 0.04390259265377563
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8838709677419355,
"acc_stderr": 0.018225757949432306,
"acc_norm": 0.8838709677419355,
"acc_norm_stderr": 0.018225757949432306
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03344283744280459,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03344283744280459
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066573,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066573
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9343434343434344,
"acc_stderr": 0.01764652667723333,
"acc_norm": 0.9343434343434344,
"acc_norm_stderr": 0.01764652667723333
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9896373056994818,
"acc_stderr": 0.007308424386792194,
"acc_norm": 0.9896373056994818,
"acc_norm_stderr": 0.007308424386792194
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8153846153846154,
"acc_stderr": 0.01967163241310029,
"acc_norm": 0.8153846153846154,
"acc_norm_stderr": 0.01967163241310029
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.5,
"acc_stderr": 0.030485538042484616,
"acc_norm": 0.5,
"acc_norm_stderr": 0.030485538042484616
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8445378151260504,
"acc_stderr": 0.023536818625398904,
"acc_norm": 0.8445378151260504,
"acc_norm_stderr": 0.023536818625398904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5562913907284768,
"acc_stderr": 0.04056527902281732,
"acc_norm": 0.5562913907284768,
"acc_norm_stderr": 0.04056527902281732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.926605504587156,
"acc_stderr": 0.011180976446357573,
"acc_norm": 0.926605504587156,
"acc_norm_stderr": 0.011180976446357573
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6990740740740741,
"acc_stderr": 0.03128039084329883,
"acc_norm": 0.6990740740740741,
"acc_norm_stderr": 0.03128039084329883
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073322,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073322
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640273,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640273
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.026936111912802273,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.026936111912802273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.026243194054073892,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.026243194054073892
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243630999,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243630999
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8711656441717791,
"acc_stderr": 0.02632138319878367,
"acc_norm": 0.8711656441717791,
"acc_norm_stderr": 0.02632138319878367
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6607142857142857,
"acc_stderr": 0.044939490686135404,
"acc_norm": 0.6607142857142857,
"acc_norm_stderr": 0.044939490686135404
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.03393295729761011,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.03393295729761011
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253874,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253874
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977725,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977725
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9144316730523627,
"acc_stderr": 0.010002965568647285,
"acc_norm": 0.9144316730523627,
"acc_norm_stderr": 0.010002965568647285
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8352601156069365,
"acc_stderr": 0.019971040982442262,
"acc_norm": 0.8352601156069365,
"acc_norm_stderr": 0.019971040982442262
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6424581005586593,
"acc_stderr": 0.01602939447489489,
"acc_norm": 0.6424581005586593,
"acc_norm_stderr": 0.01602939447489489
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8562091503267973,
"acc_stderr": 0.020091188936043728,
"acc_norm": 0.8562091503267973,
"acc_norm_stderr": 0.020091188936043728
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8295819935691319,
"acc_stderr": 0.02135534302826405,
"acc_norm": 0.8295819935691319,
"acc_norm_stderr": 0.02135534302826405
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.01924252622654454,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.01924252622654454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.624113475177305,
"acc_stderr": 0.028893955412115882,
"acc_norm": 0.624113475177305,
"acc_norm_stderr": 0.028893955412115882
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6140808344198174,
"acc_stderr": 0.012433398911476134,
"acc_norm": 0.6140808344198174,
"acc_norm_stderr": 0.012433398911476134
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8198529411764706,
"acc_stderr": 0.023345163616544838,
"acc_norm": 0.8198529411764706,
"acc_norm_stderr": 0.023345163616544838
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.015908290136278067,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.015908290136278067
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8326530612244898,
"acc_stderr": 0.02389714476891452,
"acc_norm": 0.8326530612244898,
"acc_norm_stderr": 0.02389714476891452
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824667,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824667
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.94,
"acc_stderr": 0.023868325657594194,
"acc_norm": 0.94,
"acc_norm_stderr": 0.023868325657594194
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.03851597683718533,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.03851597683718533
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072864,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072864
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41370869033047736,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.595432675425976,
"mc2_stderr": 0.014511387340720846
},
"harness|winogrande|5": {
"acc": 0.8263614838200474,
"acc_stderr": 0.010646116480330996
},
"harness|gsm8k|5": {
"acc": 0.6588324488248674,
"acc_stderr": 0.013059111935831497
}
}
```
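If you prefer to work with the raw JSON file linked above rather than going through `datasets`, it can be fetched directly with `huggingface_hub` (a minimal sketch; the filename is taken from the results URL above, and the exact top-level layout of the file is an assumption, so the lookup below is written defensively):
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file referenced in the "Latest results" link.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_logicker__SkkuDS-DPO-72B-v1",
    repo_type="dataset",
    filename="results_2024-02-16T10-55-52.095277.json",
)

with open(path) as f:
    data = json.load(f)

# The block printed above corresponds to the per-task metrics; depending
# on the file layout they may sit under a top-level "results" key.
metrics = data.get("results", data)
print(metrics["all"]["acc"])  # averaged accuracy, 0.7681... in this run
```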
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_logicker__SkkuDS-DPO-72B-v1 | [
"region:us"
] | 2024-02-16T10:57:59+00:00 | {"pretty_name": "Evaluation run of logicker/SkkuDS-DPO-72B-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [logicker/SkkuDS-DPO-72B-v1](https://huggingface.co/logicker/SkkuDS-DPO-72B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_logicker__SkkuDS-DPO-72B-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T10:55:52.095277](https://huggingface.co/datasets/open-llm-leaderboard/details_logicker__SkkuDS-DPO-72B-v1/blob/main/results_2024-02-16T10-55-52.095277.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7681185312998495,\n \"acc_stderr\": 0.02797672385731024,\n \"acc_norm\": 0.7728008468755523,\n \"acc_norm_stderr\": 0.02849748439769033,\n \"mc1\": 0.41370869033047736,\n \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.595432675425976,\n \"mc2_stderr\": 0.014511387340720846\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6271331058020477,\n \"acc_stderr\": 0.014131176760131172,\n \"acc_norm\": 0.659556313993174,\n \"acc_norm_stderr\": 0.013847460518892978\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6671977693686517,\n \"acc_stderr\": 0.004702533775930293,\n \"acc_norm\": 0.8599880501892053,\n \"acc_norm_stderr\": 0.0034629026011361893\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474928,\n \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474928\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8226415094339623,\n \"acc_stderr\": 0.023508739218846934,\n \"acc_norm\": 0.8226415094339623,\n \"acc_norm_stderr\": 0.023508739218846934\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9166666666666666,\n \"acc_stderr\": 0.023112508176051236,\n \"acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.023112508176051236\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 
0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.03295304696818317,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.03295304696818317\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.049512182523962604,\n \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.049512182523962604\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.8042553191489362,\n \"acc_stderr\": 0.025937853139977148,\n \"acc_norm\": 0.8042553191489362,\n \"acc_norm_stderr\": 0.025937853139977148\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.5877192982456141,\n \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7862068965517242,\n \"acc_stderr\": 0.03416520447747549,\n \"acc_norm\": 0.7862068965517242,\n \"acc_norm_stderr\": 0.03416520447747549\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7116402116402116,\n \"acc_stderr\": 0.02333065405453588,\n \"acc_norm\": 0.7116402116402116,\n \"acc_norm_stderr\": 0.02333065405453588\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5952380952380952,\n \"acc_stderr\": 0.04390259265377563,\n \"acc_norm\": 0.5952380952380952,\n \"acc_norm_stderr\": 0.04390259265377563\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8838709677419355,\n \"acc_stderr\": 0.018225757949432306,\n \"acc_norm\": 0.8838709677419355,\n \"acc_norm_stderr\": 0.018225757949432306\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03344283744280459,\n \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03344283744280459\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066573,\n \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066573\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9343434343434344,\n \"acc_stderr\": 0.01764652667723333,\n \"acc_norm\": 0.9343434343434344,\n \"acc_norm_stderr\": 0.01764652667723333\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9896373056994818,\n \"acc_stderr\": 0.007308424386792194,\n \"acc_norm\": 0.9896373056994818,\n \"acc_norm_stderr\": 0.007308424386792194\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8153846153846154,\n \"acc_stderr\": 0.01967163241310029,\n \"acc_norm\": 0.8153846153846154,\n \"acc_norm_stderr\": 0.01967163241310029\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.030485538042484616,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.030485538042484616\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8445378151260504,\n \"acc_stderr\": 0.023536818625398904,\n \"acc_norm\": 0.8445378151260504,\n \"acc_norm_stderr\": 0.023536818625398904\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5562913907284768,\n \"acc_stderr\": 0.04056527902281732,\n \"acc_norm\": 0.5562913907284768,\n \"acc_norm_stderr\": 0.04056527902281732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.926605504587156,\n \"acc_stderr\": 0.011180976446357573,\n \"acc_norm\": 0.926605504587156,\n \"acc_norm_stderr\": 0.011180976446357573\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6990740740740741,\n \"acc_stderr\": 0.03128039084329883,\n \"acc_norm\": 0.6990740740740741,\n \"acc_norm_stderr\": 0.03128039084329883\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073322,\n \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073322\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640273,\n \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640273\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.7982062780269058,\n \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.026243194054073892,\n \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.026243194054073892\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.03434300243630999,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.03434300243630999\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8711656441717791,\n \"acc_stderr\": 0.02632138319878367,\n \"acc_norm\": 0.8711656441717791,\n \"acc_norm_stderr\": 0.02632138319878367\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6607142857142857,\n \"acc_stderr\": 0.044939490686135404,\n \"acc_norm\": 0.6607142857142857,\n \"acc_norm_stderr\": 0.044939490686135404\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.03393295729761011,\n \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.03393295729761011\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n \"acc_stderr\": 0.015537514263253874,\n \"acc_norm\": 0.9401709401709402,\n \"acc_norm_stderr\": 0.015537514263253874\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977725,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977725\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9144316730523627,\n 
\"acc_stderr\": 0.010002965568647285,\n \"acc_norm\": 0.9144316730523627,\n \"acc_norm_stderr\": 0.010002965568647285\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8352601156069365,\n \"acc_stderr\": 0.019971040982442262,\n \"acc_norm\": 0.8352601156069365,\n \"acc_norm_stderr\": 0.019971040982442262\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6424581005586593,\n \"acc_stderr\": 0.01602939447489489,\n \"acc_norm\": 0.6424581005586593,\n \"acc_norm_stderr\": 0.01602939447489489\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8562091503267973,\n \"acc_stderr\": 0.020091188936043728,\n \"acc_norm\": 0.8562091503267973,\n \"acc_norm_stderr\": 0.020091188936043728\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8295819935691319,\n \"acc_stderr\": 0.02135534302826405,\n \"acc_norm\": 0.8295819935691319,\n \"acc_norm_stderr\": 0.02135534302826405\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.01924252622654454,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.01924252622654454\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.624113475177305,\n \"acc_stderr\": 0.028893955412115882,\n \"acc_norm\": 0.624113475177305,\n \"acc_norm_stderr\": 0.028893955412115882\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6140808344198174,\n \"acc_stderr\": 0.012433398911476134,\n \"acc_norm\": 0.6140808344198174,\n \"acc_norm_stderr\": 0.012433398911476134\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8198529411764706,\n \"acc_stderr\": 0.023345163616544838,\n \"acc_norm\": 0.8198529411764706,\n \"acc_norm_stderr\": 0.023345163616544838\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.015908290136278067,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.015908290136278067\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8326530612244898,\n \"acc_stderr\": 0.02389714476891452,\n \"acc_norm\": 0.8326530612244898,\n \"acc_norm_stderr\": 0.02389714476891452\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n \"acc_stderr\": 0.022076326101824667,\n \"acc_norm\": 0.8905472636815921,\n \"acc_norm_stderr\": 0.022076326101824667\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.94,\n \"acc_stderr\": 0.023868325657594194,\n \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.023868325657594194\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.03851597683718533,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.03851597683718533\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41370869033047736,\n \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.595432675425976,\n \"mc2_stderr\": 0.014511387340720846\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8263614838200474,\n \"acc_stderr\": 0.010646116480330996\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6588324488248674,\n \"acc_stderr\": 0.013059111935831497\n }\n}\n```", "repo_url": 
"https://huggingface.co/logicker/SkkuDS-DPO-72B-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|arc:challenge|25_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|gsm8k|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hellaswag|10_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T10-55-52.095277.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T10-55-52.095277.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T10-55-52.095277.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T10-55-52.095277.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T10-55-52.095277.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T10-55-52.095277.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["**/details_harness|winogrande|5_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T10-55-52.095277.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T10_55_52.095277", "path": ["results_2024-02-16T10-55-52.095277.parquet"]}, {"split": "latest", "path": 
["results_2024-02-16T10-55-52.095277.parquet"]}]}]} | 2024-02-16T10:58:21+00:00 |
3f03bfb4a087f7bbf7803c28697ddec930015b7a | lab42/cov-vqa-raw | [
"region:us"
] | 2024-02-16T10:58:46+00:00 | {"dataset_info": {"features": [{"name": "image_0", "dtype": "image"}, {"name": "image_1", "dtype": "image"}, {"name": "image_2", "dtype": "image"}, {"name": "images_rest", "sequence": "image"}, {"name": "mask_0", "dtype": "image"}, {"name": "mask_1", "dtype": "image"}, {"name": "mask_2", "dtype": "image"}, {"name": "masks_rest", "sequence": "image"}, {"name": "conversations", "dtype": "string"}, {"name": "id", "dtype": "string"}, {"name": "dataset", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "n_images", "dtype": "int32"}, {"name": "n_masks", "dtype": "int32"}, {"name": "n_conversations", "dtype": "int32"}], "splits": [{"name": "validation", "num_bytes": 8969190.0, "num_examples": 75}, {"name": "train", "num_bytes": 61236642.0, "num_examples": 517}], "download_size": 228214510, "dataset_size": 70205832.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-02-16T16:26:47+00:00 |
|
4fcc4c125132fc1d94e5255aaef7dbda82309fc7 | loubnabnl/test_amt | [
"region:us"
] | 2024-02-16T11:01:35+00:00 | {"dataset_info": {"features": [], "splits": [{"name": "train", "num_bytes": 0, "num_examples": 0}], "download_size": 324, "dataset_size": 0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T11:01:36+00:00 |
|
7db11f46563330c51a5aa603f014655e61ff05ee | iazo0oz/RAG_Illegal_Working_Migrants_and_Labour_Expl | [
"region:us"
] | 2024-02-16T11:14:29+00:00 | {} | 2024-02-16T12:59:16+00:00 |
|
95f05a7c089d5c97564b4002f35283579dae04e7 | perceptron-743/good-reads-data | [
"region:us"
] | 2024-02-16T11:16:23+00:00 | {"dataset_info": {"features": [{"name": "bookId", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "series", "dtype": "string"}, {"name": "author", "dtype": "string"}, {"name": "rating", "dtype": "float64"}, {"name": "description", "dtype": "string"}, {"name": "language", "dtype": "string"}, {"name": "isbn", "dtype": "string"}, {"name": "genres", "dtype": "string"}, {"name": "characters", "dtype": "string"}, {"name": "bookFormat", "dtype": "string"}, {"name": "edition", "dtype": "string"}, {"name": "pages", "dtype": "string"}, {"name": "publisher", "dtype": "string"}, {"name": "publishDate", "dtype": "string"}, {"name": "firstPublishDate", "dtype": "string"}, {"name": "awards", "dtype": "string"}, {"name": "numRatings", "dtype": "int64"}, {"name": "ratingsByStars", "dtype": "string"}, {"name": "likedPercent", "dtype": "float64"}, {"name": "setting", "dtype": "string"}, {"name": "coverImg", "dtype": "string"}, {"name": "bbeScore", "dtype": "int64"}, {"name": "bbeVotes", "dtype": "int64"}, {"name": "price", "dtype": "string"}, {"name": "word_count", "dtype": "int64"}, {"name": "input_ids", "sequence": "int32"}, {"name": "token_type_ids", "sequence": "int8"}, {"name": "attention_mask", "sequence": "int8"}], "splits": [{"name": "train", "num_bytes": 175287336, "num_examples": 53944}], "download_size": 55861445, "dataset_size": 175287336}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T12:47:06+00:00 |