| sha | text | id | tags | created_at | metadata | last_modified |
|---|---|---|---|---|---|---|
be6a661c7effb976d9a8042e5d55236132049883 |
# Dataset of reisen/レイセン (Touhou)
This is the dataset of reisen/レイセン (Touhou), containing 227 images and their tags.
The core tags of this character are `animal_ears, rabbit_ears, short_hair, red_eyes, blue_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 227 | 183.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/reisen_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (shorter edge resized to 1400 pixels if larger). |
| 800 | 227 | 125.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/reisen_touhou/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 463 | 249.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/reisen_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with an area of at least 480x480 pixels. |
| 1200 | 227 | 167.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/reisen_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 463 | 321.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/reisen_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with an area of at least 480x480 pixels. |
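The IMG+TXT packages pair each image with a same-named `.txt` file holding its comma-separated tags. Below is a minimal loading sketch; the pairing convention and flat archive layout are assumptions based on the package type, not guaranteed by this card:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download and extract the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/reisen_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair every image with its same-named .txt tag file (assumed convention)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
        continue
    txt_path = os.path.join(dataset_dir, stem + '.txt')
    if os.path.isfile(txt_path):
        with open(txt_path, encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```
The raw package instead carries waifuc metadata and is loaded via `LocalSource`, as shown in the next section.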
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/reisen_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here (see the sketch after the tables).
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, long_sleeves, red_necktie, solo, collared_shirt, white_shirt, black_jacket, rifle, pleated_skirt, looking_at_viewer, standing, bangs, pink_skirt, blazer, holding_gun, crescent_pin, open_mouth, smile, blush, hair_between_eyes, buttons, one-hour_drawing_challenge, simple_background |
| 1 | 10 |  |  |  |  |  | 1girl, collared_shirt, long_sleeves, red_necktie, solo, white_shirt, blazer, pleated_skirt, looking_at_viewer, white_background, simple_background, cowboy_shot, open_mouth, rabbit_girl, rabbit_tail, black_jacket, crescent_pin, pink_skirt, bangs, closed_mouth, floppy_ears |
| 2 | 8 |  |  |  |  |  | 1girl, blazer, necktie, purple_hair, skirt, solo, rabbit_tail, open_mouth |
| 3 | 18 |  |  |  |  |  | 1girl, solo, blazer, necktie, skirt, black_thighhighs, smile, zettai_ryouiki, open_mouth |
| 4 | 7 |  |  |  |  |  | 1girl, solo, bat_wings, dress, looking_at_viewer, short_sleeves, smile, wrist_cuffs, mob_cap, multiple_girls, open_mouth, puffy_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | red_necktie | solo | collared_shirt | white_shirt | black_jacket | rifle | pleated_skirt | looking_at_viewer | standing | bangs | pink_skirt | blazer | holding_gun | crescent_pin | open_mouth | smile | blush | hair_between_eyes | buttons | one-hour_drawing_challenge | simple_background | white_background | cowboy_shot | rabbit_girl | rabbit_tail | closed_mouth | floppy_ears | necktie | purple_hair | skirt | black_thighhighs | zettai_ryouiki | bat_wings | dress | short_sleeves | wrist_cuffs | mob_cap | multiple_girls | puffy_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------|:-------|:-----------------|:--------------|:---------------|:--------|:----------------|:--------------------|:-----------|:--------|:-------------|:---------|:--------------|:---------------|:-------------|:--------|:--------|:--------------------|:----------|:-----------------------------|:--------------------|:-------------------|:--------------|:--------------|:--------------|:---------------|:--------------|:----------|:--------------|:--------|:-------------------|:-----------------|:------------|:--------|:----------------|:--------------|:----------|:-----------------|:----------------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | | X | X | X | | X | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | | | X | | | | | | | | | | X | | | X | | | | | | | | | | X | | | X | X | X | | | | | | | | | |
| 3 | 18 |  |  |  |  |  | X | | | X | | | | | | | | | | X | | | X | X | | | | | | | | | | | | X | | X | X | X | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | | X | | | | | | X | | | | | | | X | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X |
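For example, to mine the school-uniform outfit of cluster 0 above, one can select the images whose tag files contain that cluster's distinctive tags. A sketch under the same assumptions as the loading example (comma-separated `.txt` tag files next to the images):
```python
import os

dataset_dir = 'dataset_800'  # extracted IMG+TXT package from the sketch above
cluster_tags = {'black_jacket', 'red_necktie', 'pleated_skirt', 'crescent_pin'}

for name in sorted(os.listdir(dataset_dir)):
    if not name.endswith('.txt'):
        continue
    with open(os.path.join(dataset_dir, name), encoding='utf-8') as f:
        tags = {t.strip() for t in f.read().split(',')}
    if cluster_tags <= tags:  # the image carries every tag of the cluster
        print(os.path.splitext(name)[0])
```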
| CyberHarem/reisen_touhou | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-15T01:04:59+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-15T02:34:41+00:00 |
f4c88694e2f4541bd571b6130c3e354b55d9da52 |
# Dataset of teireida_mai/丁礼田舞 (Touhou)
This is the dataset of teireida_mai/丁礼田舞 (Touhou), containing 328 images and their tags.
The core tags of this character are `green_hair, green_eyes, hat, black_headwear, bow, sidelocks, bangs, yellow_bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 328 | 271.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/teireida_mai_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (shorter edge resized to 1400 pixels if larger). |
| 800 | 328 | 193.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/teireida_mai_touhou/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 653 | 357.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/teireida_mai_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with an area of at least 480x480 pixels. |
| 1200 | 328 | 252.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/teireida_mai_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 653 | 457.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/teireida_mai_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with an area of at least 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/teireida_mai_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 31 |  |  |  |  |  | green_dress, short_hair_with_long_locks, 1girl, solo, waist_apron, black_socks, looking_at_viewer, tate_eboshi, full_body, green_footwear, bamboo, holding, frills, white_background, open_mouth, kneehighs, mary_janes, puffy_short_sleeves, simple_background, :d, white_apron, yellow_ribbon |
| 1 | 5 |  |  |  |  |  | 1girl, bamboo, green_dress, looking_at_viewer, open_mouth, puffy_short_sleeves, short_hair_with_long_locks, solo, tate_eboshi, waist_apron, :d, frills, holding, simple_background, white_apron, green_background |
| 2 | 5 |  |  |  |  |  | 2girls, brown_hair, frills, green_dress, puffy_short_sleeves, short_hair_with_long_locks, tate_eboshi, waist_apron, bamboo, pink_dress, holding, white_apron, grin, looking_at_viewer, solo_focus, star_(symbol) |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | green_dress | short_hair_with_long_locks | 1girl | solo | waist_apron | black_socks | looking_at_viewer | tate_eboshi | full_body | green_footwear | bamboo | holding | frills | white_background | open_mouth | kneehighs | mary_janes | puffy_short_sleeves | simple_background | :d | white_apron | yellow_ribbon | green_background | 2girls | brown_hair | pink_dress | grin | solo_focus | star_(symbol) |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------|:-----------------------------|:--------|:-------|:--------------|:--------------|:--------------------|:--------------|:------------|:-----------------|:---------|:----------|:---------|:-------------------|:-------------|:------------|:-------------|:----------------------|:--------------------|:-----|:--------------|:----------------|:-------------------|:---------|:-------------|:-------------|:-------|:-------------|:----------------|
| 0 | 31 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | X | X | | | X | X | X | | X | | | X | X | X | X | | X | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | | X | | X | X | | | X | X | X | | | | | X | | | X | | | X | X | X | X | X | X |
| CyberHarem/teireida_mai_touhou | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-15T01:05:05+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-15T02:20:44+00:00 |
3a9df689d5b5c4e8a623719c31f19a48b2ac2399 | This dataset accompanies the following publication; please cite it if you use this dataset:
Fischer, T. and Milford, M., 2020. Event-Based Visual Place Recognition With Ensembles of Temporal Windows. IEEE Robotics and Automation Letters, 5(4), pp.6924-6931.
```bibtex
@article{fischer2020event,
  title={Event-Based Visual Place Recognition With Ensembles of Temporal Windows},
  author={Fischer, Tobias and Milford, Michael},
  journal={IEEE Robotics and Automation Letters},
  volume={5},
  number={4},
  pages={6924--6931},
  year={2020}
}
```
The dataset contains five sequences of recordings. For each recording, a denoised `parquet` file is made available.
The source files for these `parquet` files can be found on [Zenodo](https://zenodo.org/records/4302805).
We also provide the associated GPS information (`*.nmea` files) recorded using the consumer camera.
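As an illustration, a recording can be inspected as follows. This is a sketch only: the file names are hypothetical, and the event columns (`t`, `x`, `y`, `polarity`) are an assumption typical of event-camera data rather than the documented schema; the code repository linked below has the actual loaders.
```python
import pandas as pd
import pynmea2  # pip install pynmea2

# load a denoised event stream (file name hypothetical; columns assumed)
events = pd.read_parquet('sunset1_denoised.parquet')
print(events.head())

# parse GPS fixes from the matching NMEA log (file name hypothetical)
with open('sunset1.nmea') as f:
    for line in f:
        try:
            msg = pynmea2.parse(line.strip())
        except pynmea2.ParseError:
            continue
        if hasattr(msg, 'latitude') and hasattr(msg, 'longitude'):
            print(msg.latitude, msg.longitude)
```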
Please see the [associated code repository](https://github.com/Tobias-Fischer/sparse-event-vpr) for more information. | TobiasRobotics/brisbane-event-vpr | [
"license:cc-by-nc-sa-4.0",
"computer vision",
"robotics",
"event cameras",
"region:us"
] | 2024-01-15T01:11:21+00:00 | {"license": "cc-by-nc-sa-4.0", "pretty_name": "Brisbane Event VPR", "tags": ["computer vision", "robotics", "event cameras"], "arxiv": 2006.02826} | 2024-01-15T01:29:19+00:00 |
cce60cbe9eb7c609fd005e0e142b0ecf61aef4c2 | Hiraishin/ujianjpj-test-prep | [
"license:apache-2.0",
"region:us"
] | 2024-01-15T01:17:07+00:00 | {"license": "apache-2.0"} | 2024-01-15T01:21:15+00:00 |
|
9d4066eebaed8b16a8df3fe9eb34e7e622417ea6 | AlcNdr/AlcVoice | [
"license:unknown",
"region:us"
] | 2024-01-15T01:25:55+00:00 | {"license": "unknown"} | 2024-01-15T01:26:21+00:00 |
|
14eff720d98f9f8eb9655db51a90b6114c7c5a9e | Tsuinzues/estrelapolar | [
"license:openrail",
"region:us"
] | 2024-01-15T01:37:09+00:00 | {"license": "openrail"} | 2024-01-15T01:37:23+00:00 |
|
42b2718cedf0b94b350dcbfedb625e2f336ac0ee | zhaospei/scg-v2 | [
"region:us"
] | 2024-01-15T01:37:45+00:00 | {} | 2024-01-15T02:53:49+00:00 |
|
1f74aabd0dbf73179f914e77a26b2389516955bb |
# Dataset Card for Evaluation run of NeuralNovel/Gecko-7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NeuralNovel/Gecko-7B-v0.1](https://huggingface.co/NeuralNovel/Gecko-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NeuralNovel__Gecko-7B-v0.1",
    "harness_winogrande_5",
    split="train")
```
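The other task configurations can be enumerated with the `datasets` library. A short sketch; the `latest` split name follows the configuration metadata listed at the end of this card:
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_NeuralNovel__Gecko-7B-v0.1"

# list all available task configurations
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# every configuration also exposes a "latest" split pointing at the newest run
data = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(data[0])
```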
## Latest results
These are the [latest results from run 2024-01-16T16:13:12.225780](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Gecko-7B-v0.1/blob/main/results_2024-01-16T16-13-12.225780.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6099096028262384,
"acc_stderr": 0.03317410149444282,
"acc_norm": 0.6143554464489048,
"acc_norm_stderr": 0.03384780111199933,
"mc1": 0.4638922888616891,
"mc1_stderr": 0.017457800422268622,
"mc2": 0.6260121840084173,
"mc2_stderr": 0.015381860069987416
},
"harness|arc:challenge|25": {
"acc": 0.5656996587030717,
"acc_stderr": 0.014484703048857359,
"acc_norm": 0.613481228668942,
"acc_norm_stderr": 0.014230084761910478
},
"harness|hellaswag|10": {
"acc": 0.6475801633140809,
"acc_stderr": 0.004767475366689761,
"acc_norm": 0.8335988846843259,
"acc_norm_stderr": 0.0037167914663914794
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.024870815251057093,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.024870815251057093
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.026923446059302844,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.026923446059302844
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709437,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5923076923076923,
"acc_stderr": 0.024915243985987847,
"acc_norm": 0.5923076923076923,
"acc_norm_stderr": 0.024915243985987847
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6134453781512605,
"acc_stderr": 0.03163145807552378,
"acc_norm": 0.6134453781512605,
"acc_norm_stderr": 0.03163145807552378
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659806,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.01732435232501601,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.01732435232501601
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134987,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134987
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281348,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281348
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7867177522349936,
"acc_stderr": 0.014648172749593517,
"acc_norm": 0.7867177522349936,
"acc_norm_stderr": 0.014648172749593517
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35083798882681566,
"acc_stderr": 0.015961036675230963,
"acc_norm": 0.35083798882681566,
"acc_norm_stderr": 0.015961036675230963
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.026858825879488544,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.026858825879488544
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409825,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409825
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43089960886571055,
"acc_stderr": 0.012647695889547235,
"acc_norm": 0.43089960886571055,
"acc_norm_stderr": 0.012647695889547235
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02952009569768776,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02952009569768776
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6225490196078431,
"acc_stderr": 0.019610851474880283,
"acc_norm": 0.6225490196078431,
"acc_norm_stderr": 0.019610851474880283
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.029393609319879804,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.029393609319879804
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7860696517412935,
"acc_stderr": 0.02899690969332891,
"acc_norm": 0.7860696517412935,
"acc_norm_stderr": 0.02899690969332891
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4638922888616891,
"mc1_stderr": 0.017457800422268622,
"mc2": 0.6260121840084173,
"mc2_stderr": 0.015381860069987416
},
"harness|winogrande|5": {
"acc": 0.7758484609313339,
"acc_stderr": 0.011720400740774094
},
"harness|gsm8k|5": {
"acc": 0.41546626231993933,
"acc_stderr": 0.013574222625031811
}
}
```
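As a worked example, the MMLU-style aggregate is the mean of the `hendrycksTest` per-task accuracies. A sketch computing it, assuming the downloaded results file mirrors the dictionary printed above (the real file may nest these metrics under an extra key):
```python
import json
import statistics

# layout assumed to mirror the dict above; adjust if the metrics are nested
with open('results_2024-01-16T16-13-12.225780.json') as f:
    results = json.load(f)

mmlu = [v['acc'] for k, v in results.items()
        if k.startswith('harness|hendrycksTest')]
print(f'{len(mmlu)} MMLU subtasks, mean acc = {statistics.mean(mmlu):.4f}')
```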
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_NeuralNovel__Gecko-7B-v0.1 | [
"region:us"
] | 2024-01-15T01:39:43+00:00 | {"pretty_name": "Evaluation run of NeuralNovel/Gecko-7B-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [NeuralNovel/Gecko-7B-v0.1](https://huggingface.co/NeuralNovel/Gecko-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NeuralNovel__Gecko-7B-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-16T16:13:12.225780](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Gecko-7B-v0.1/blob/main/results_2024-01-16T16-13-12.225780.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6099096028262384,\n \"acc_stderr\": 0.03317410149444282,\n \"acc_norm\": 0.6143554464489048,\n \"acc_norm_stderr\": 0.03384780111199933,\n \"mc1\": 0.4638922888616891,\n \"mc1_stderr\": 0.017457800422268622,\n \"mc2\": 0.6260121840084173,\n \"mc2_stderr\": 0.015381860069987416\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5656996587030717,\n \"acc_stderr\": 0.014484703048857359,\n \"acc_norm\": 0.613481228668942,\n \"acc_norm_stderr\": 0.014230084761910478\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6475801633140809,\n \"acc_stderr\": 0.004767475366689761,\n \"acc_norm\": 0.8335988846843259,\n \"acc_norm_stderr\": 0.0037167914663914794\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.032683358999363366,\n \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.032683358999363366\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.024870815251057093,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.024870815251057093\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n \"acc_stderr\": 0.026923446059302844,\n \"acc_norm\": 0.6612903225806451,\n \"acc_norm_stderr\": 0.026923446059302844\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175007,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175007\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139404,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139404\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709437,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709437\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5923076923076923,\n \"acc_stderr\": 0.024915243985987847,\n \"acc_norm\": 0.5923076923076923,\n \"acc_norm_stderr\": 0.024915243985987847\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.03163145807552378,\n \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.03163145807552378\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659806,\n \"acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659806\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7944954128440367,\n \"acc_stderr\": 0.01732435232501601,\n \"acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.01732435232501601\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159263,\n \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159263\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n \"acc_stderr\": 0.03210062154134987,\n \"acc_norm\": 0.6457399103139013,\n \"acc_norm_stderr\": 0.03210062154134987\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281348,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281348\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7867177522349936,\n \"acc_stderr\": 0.014648172749593517,\n \"acc_norm\": 0.7867177522349936,\n \"acc_norm_stderr\": 0.014648172749593517\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35083798882681566,\n \"acc_stderr\": 0.015961036675230963,\n \"acc_norm\": 0.35083798882681566,\n \"acc_norm_stderr\": 0.015961036675230963\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n \"acc_stderr\": 0.026858825879488544,\n \"acc_norm\": 0.662379421221865,\n \"acc_norm_stderr\": 0.026858825879488544\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409825,\n \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409825\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43089960886571055,\n \"acc_stderr\": 0.012647695889547235,\n \"acc_norm\": 0.43089960886571055,\n \"acc_norm_stderr\": 0.012647695889547235\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02952009569768776,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02952009569768776\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6225490196078431,\n \"acc_stderr\": 0.019610851474880283,\n \"acc_norm\": 0.6225490196078431,\n \"acc_norm_stderr\": 0.019610851474880283\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.029393609319879804,\n \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.029393609319879804\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n \"acc_stderr\": 0.02899690969332891,\n \"acc_norm\": 0.7860696517412935,\n \"acc_norm_stderr\": 0.02899690969332891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4638922888616891,\n \"mc1_stderr\": 0.017457800422268622,\n \"mc2\": 0.6260121840084173,\n \"mc2_stderr\": 0.015381860069987416\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774094\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.41546626231993933,\n \"acc_stderr\": 0.013574222625031811\n }\n}\n```", "repo_url": 
"https://huggingface.co/NeuralNovel/Gecko-7B-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|arc:challenge|25_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|arc:challenge|25_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|arc:challenge|25_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|gsm8k|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|gsm8k|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|gsm8k|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hellaswag|10_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hellaswag|10_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hellaswag|10_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-15T01-37-25.127753.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T01-37-25.127753.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-15T01-37-25.127753.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T02-41-01.393804.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T02-41-01.393804.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T16-13-12.225780.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T16-13-12.225780.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T16-13-12.225780.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-16T16-13-12.225780.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-16T16-13-12.225780.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": 
"2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": 
["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", 
"data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": 
["**/details_harness|hendrycksTest-sociology|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["**/details_harness|winogrande|5_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": ["**/details_harness|winogrande|5_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["**/details_harness|winogrande|5_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-16T16-13-12.225780.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_15T01_37_25.127753", "path": ["results_2024-01-15T01-37-25.127753.parquet"]}, {"split": "2024_01_16T02_41_01.393804", "path": 
["results_2024-01-16T02-41-01.393804.parquet"]}, {"split": "2024_01_16T16_13_12.225780", "path": ["results_2024-01-16T16-13-12.225780.parquet"]}, {"split": "latest", "path": ["results_2024-01-16T16-13-12.225780.parquet"]}]}]} | 2024-01-16T16:15:30+00:00 |
e463e76f3391a7212d7a7fbb3d3bbabf8e805c26 | flowersfromthefuture/F01 | [
"region:us"
] | 2024-01-15T01:45:08+00:00 | {} | 2024-01-15T01:45:20+00:00 |
|
832a9d92a6727255a1fd21e2ad172c21f5b03f72 | shokhjakhon/chat-koni-data | [
"size_categories:1K<n<10K",
"language:ru",
"license:apache-2.0",
"region:us"
] | 2024-01-15T01:45:10+00:00 | {"language": ["ru"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "pretty_name": "law-data by uzlegalai"} | 2024-01-15T01:50:15+00:00 |
|
2fc29e711cdba8b0b9651161670db0ac08db3a0d | sarahahatee/rhnd | [
"region:us"
] | 2024-01-15T01:59:04+00:00 | {} | 2024-01-15T02:01:39+00:00 |
|
e9708be65e63256c155dd0bf3027204ed6e60506 | Morning730/realisticVisionV51_v51VAE | [
"region:us"
] | 2024-01-15T02:04:55+00:00 | {} | 2024-01-15T02:08:45+00:00 |
|
c2607a491c843839b30837a307ac7c7d1f5ca4d9 | Navarro20/robin | [
"license:openrail",
"region:us"
] | 2024-01-15T02:10:04+00:00 | {"license": "openrail"} | 2024-01-15T02:10:52+00:00 |
|
999e51ff914ac831ea63560db9e797278b44a8a7 |
This is a dataset with explanations from ChatGPT for the correct and incorrect answers in CommonsenseQA. The explanations are generated by prompting ChatGPT with answer keys and in-context examples. We expect this dataset to be a useful source for understanding the commonsense reasoning ability of LLMs or training other LMs. | KomeijiForce/CommonsenseQA-Explained-by-ChatGPT | [
"task_categories:question-answering",
"size_categories:10K<n<100K",
"language:en",
"region:us"
] | 2024-01-15T02:13:59+00:00 | {"language": ["en"], "size_categories": ["10K<n<100K"], "task_categories": ["question-answering"]} | 2024-01-15T02:19:22+00:00 |
4191a2e3641c8e0894850568d1f5ee8b8f3ba7f9 |
This is a dataset with explanations from ChatGPT for the correct and incorrect answers in ARC-Easy. The explanations are generated by prompting ChatGPT with answer keys and in-context examples. We expect this dataset to be a useful source for understanding the commonsense reasoning ability of LLMs or training other LMs. | KomeijiForce/ARC-Easy-Explained-by-ChatGPT | [
"task_categories:question-answering",
"size_categories:1K<n<10K",
"language:en",
"region:us"
] | 2024-01-15T02:22:18+00:00 | {"language": ["en"], "size_categories": ["1K<n<10K"], "task_categories": ["question-answering"]} | 2024-01-15T02:27:45+00:00 |
ef4ae17eacd1db833a4c63e08cc6b85eaf34c374 | kevinliu0619/aaa123 | [
"region:us"
] | 2024-01-15T02:28:05+00:00 | {} | 2024-01-15T02:28:05+00:00 |
|
d90607dfedcf8e2a2cb562e75e5cd0f001bea8e2 |
This is a dataset with explanations from ChatGPT for the correct and incorrect answers in ARC Challenge. The explanations are generated by prompting ChatGPT with answer keys and in-context examples. We expect this dataset to be a useful source for understanding the commonsense reasoning ability of LLMs or training other LMs. | KomeijiForce/ARC-Challenge-Explained-by-ChatGPT | [
"task_categories:question-answering",
"size_categories:1K<n<10K",
"language:en",
"region:us"
] | 2024-01-15T02:28:36+00:00 | {"language": ["en"], "size_categories": ["1K<n<10K"], "task_categories": ["question-answering"]} | 2024-01-15T02:31:28+00:00 |
018d25f58895cf7acbb8698650d00db927a0a92c |
# Dataset Card for Evaluation run of rombodawg/Everyone-Coder-4x7b-Base
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [rombodawg/Everyone-Coder-4x7b-Base](https://huggingface.co/rombodawg/Everyone-Coder-4x7b-Base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
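As a minimal sketch (assuming the "results" config and "latest" split follow the naming scheme used elsewhere in this card), the aggregated results can be loaded directly:
```python
from datasets import load_dataset

# Sketch: load the aggregated results; the "latest" split points to
# the newest run (config/split names assumed from this card).
results = load_dataset(
    "open-llm-leaderboard/details_rombodawg__Everyone-Coder-4x7b-Base",
    "results",
    split="latest",
)
```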
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rombodawg__Everyone-Coder-4x7b-Base",
"harness_winogrande_5",
split="train")
```
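To pin the details to one specific run rather than the latest, you can instead request its timestamped split; a minimal sketch, assuming the split name is derived from the run timestamp as described above:
```python
from datasets import load_dataset

# Sketch: load the winogrande details for the run from
# 2024-01-15T17:47:56.627468 via its timestamped split (split name
# assumed to follow the timestamp-derived pattern used in this card).
data = load_dataset(
    "open-llm-leaderboard/details_rombodawg__Everyone-Coder-4x7b-Base",
    "harness_winogrande_5",
    split="2024_01_15T17_47_56.627468",
)
```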
## Latest results
These are the [latest results from run 2024-01-15T17:47:56.627468](https://huggingface.co/datasets/open-llm-leaderboard/details_rombodawg__Everyone-Coder-4x7b-Base/blob/main/results_2024-01-15T17-47-56.627468.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6447898132540958,
"acc_stderr": 0.031915985387073305,
"acc_norm": 0.6461876134084575,
"acc_norm_stderr": 0.03255592718009434,
"mc1": 0.3390452876376989,
"mc1_stderr": 0.016571797910626615,
"mc2": 0.49160643723765735,
"mc2_stderr": 0.015188709391608397
},
"harness|arc:challenge|25": {
"acc": 0.6117747440273038,
"acc_stderr": 0.014241614207414046,
"acc_norm": 0.6450511945392492,
"acc_norm_stderr": 0.013983036904094087
},
"harness|hellaswag|10": {
"acc": 0.6623182632941645,
"acc_stderr": 0.004719529099913131,
"acc_norm": 0.8481378211511651,
"acc_norm_stderr": 0.0035815378475817965
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778408,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778408
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.0436031486007746,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.0436031486007746
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031096,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031096
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.031584153240477114,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.031584153240477114
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062153,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062153
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465076,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465076
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.01646534546739154,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.01646534546739154
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967407,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967407
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057222,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057222
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097652,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097652
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281382,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281382
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3318435754189944,
"acc_stderr": 0.015748421208187303,
"acc_norm": 0.3318435754189944,
"acc_norm_stderr": 0.015748421208187303
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959603,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959603
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45827900912646674,
"acc_stderr": 0.01272570165695364,
"acc_norm": 0.45827900912646674,
"acc_norm_stderr": 0.01272570165695364
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.01897542792050721,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.01897542792050721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174923,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174923
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466136,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466136
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3390452876376989,
"mc1_stderr": 0.016571797910626615,
"mc2": 0.49160643723765735,
"mc2_stderr": 0.015188709391608397
},
"harness|winogrande|5": {
"acc": 0.7916337805840569,
"acc_stderr": 0.011414554399987729
},
"harness|gsm8k|5": {
"acc": 0.6345716451857468,
"acc_stderr": 0.013264282030266635
}
}
```
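As a quick orientation, the per-task records behind these aggregates can be loaded directly with the `datasets` library. The sketch below is illustrative: the config name `harness_gsm8k_5` and the `latest` split alias follow the naming scheme used by this repository's configurations, and the ±1.96·stderr interval is the usual normal approximation, not a figure reported by the harness.

```python
from datasets import load_dataset

# Detailed per-sample records for one task from this repository.
# Config names follow the pattern "harness_<task>_<num_fewshot>";
# the "latest" split always points at the most recent run.
details = load_dataset(
    "open-llm-leaderboard/details_rombodawg__Everyone-Coder-4x7b-Base",
    "harness_gsm8k_5",
    split="latest",
)
print(details)

# The aggregate above reports acc = 0.6346 with acc_stderr = 0.0133 for
# GSM8K (5-shot); a rough 95% interval via the normal approximation:
acc, stderr = 0.6345716451857468, 0.013264282030266635
print(f"gsm8k 5-shot acc: {acc:.4f} ± {1.96 * stderr:.4f}")
```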
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
Each evaluated task has its own configuration (63 in total). Every configuration contains one split per evaluation run, named after the run's timestamp (e.g. `2024_01_15T17_47_56.627468`), plus a `latest` split that always points to the most recent run; an additional `results` configuration stores the aggregated metrics.
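For a concrete look at this layout, here is a minimal sketch (assuming the standard `datasets` inspection helpers) that lists the available configurations and the splits of one of them:

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_rombodawg__Everyone-Coder-4x7b-Base"

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs), "configs, e.g.", configs[:3])

# Each configuration has one timestamped split per run and a "latest" alias.
print(get_dataset_split_names(repo, "harness_winogrande_5"))
```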
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
["**/details_harness|hendrycksTest-public_relations|5_2024-01-15T17-47-56.627468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-15T17-47-56.627468.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_15T02_37_27.677232", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-15T02-37-27.677232.parquet"]}, {"split": "2024_01_15T17_47_56.627468", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-15T17-47-56.627468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-15T17-47-56.627468.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_15T02_37_27.677232", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-15T02-37-27.677232.parquet"]}, {"split": "2024_01_15T17_47_56.627468", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-15T17-47-56.627468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-15T17-47-56.627468.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_15T02_37_27.677232", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T02-37-27.677232.parquet"]}, {"split": "2024_01_15T17_47_56.627468", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T17-47-56.627468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T17-47-56.627468.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_15T02_37_27.677232", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-15T02-37-27.677232.parquet"]}, {"split": "2024_01_15T17_47_56.627468", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-15T17-47-56.627468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-15T17-47-56.627468.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_15T02_37_27.677232", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-15T02-37-27.677232.parquet"]}, {"split": "2024_01_15T17_47_56.627468", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-15T17-47-56.627468.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-15T17-47-56.627468.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_15T02_37_27.677232", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-15T02-37-27.677232.parquet"]}, {"split": "2024_01_15T17_47_56.627468", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-15T17-47-56.627468.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-15T17-47-56.627468.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_15T02_37_27.677232", "path": ["**/details_harness|winogrande|5_2024-01-15T02-37-27.677232.parquet"]}, {"split": "2024_01_15T17_47_56.627468", "path": ["**/details_harness|winogrande|5_2024-01-15T17-47-56.627468.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-15T17-47-56.627468.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_15T02_37_27.677232", "path": ["results_2024-01-15T02-37-27.677232.parquet"]}, {"split": "2024_01_15T17_47_56.627468", "path": 
["results_2024-01-15T17-47-56.627468.parquet"]}, {"split": "latest", "path": ["results_2024-01-15T17-47-56.627468.parquet"]}]}]} | 2024-01-15T17:50:34+00:00 |
e72481391a699d2e233a3ebac76444cf648888c6 | # Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | ericanzdu/dtest | [
"task_categories:token-classification",
"task_categories:image-to-3d",
"task_ids:language-modeling",
"size_categories:1K<n<10K",
"biology",
"art",
"region:us"
] | 2024-01-15T02:39:56+00:00 | {"size_categories": ["1K<n<10K"], "task_categories": ["token-classification", "image-to-3d"], "task_ids": ["language-modeling", "image-resize"], "tags": ["biology", "art"]} | 2024-01-15T10:15:24+00:00 |
2c28f957aa40e766851ad7a3916367c3007d2724 |
# Dataset of kitashirakawa_chiyuri/北白河ちゆり (Touhou)
This is the dataset of kitashirakawa_chiyuri/北白河ちゆり (Touhou), containing 151 images and their tags.
The core tags of this character are `blonde_hair, twintails, hat, sailor_hat, yellow_eyes, white_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 151 | 130.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kitashirakawa_chiyuri_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 151 | 86.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kitashirakawa_chiyuri_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 299 | 171.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kitashirakawa_chiyuri_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 151 | 118.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kitashirakawa_chiyuri_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 299 | 225.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kitashirakawa_chiyuri_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kitashirakawa_chiyuri_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
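The IMG+TXT packages in the table above can be fetched the same way. A minimal sketch for the 800px variant, assuming only that the filename matches the download URL shown in the table:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package (filename taken from the table above)
zip_file = hf_hub_download(
    repo_id='CyberHarem/kitashirakawa_chiyuri_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)
# extract the images and their .txt tag files to a local directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```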
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, blue_sailor_collar, solo, white_shorts, midriff, navel, smile, open_mouth |
| 1 | 7 |  |  |  |  |  | 2girls, blue_sailor_collar, midriff, red_hair, short_hair, shorts, navel, folding_chair, smile |
| 2 | 7 |  |  |  |  |  | 1girl, blue_sailor_collar, medium_hair, sailor_shirt, solo, white_shirt, bangs, blue_neckerchief, blush, upper_body, looking_at_viewer, simple_background, anchor_symbol, happy, white_background, closed_mouth, grin, puffy_short_sleeves |
| 3 | 7 |  |  |  |  |  | 1girl, blue_sailor_collar, midriff, open_mouth, puffy_short_sleeves, sailor_shirt, solo, white_shirt, white_shorts, anchor_symbol, medium_hair, blue_neckerchief, navel, smile, stomach, blush, folding_chair, happy, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_sailor_collar | solo | white_shorts | midriff | navel | smile | open_mouth | 2girls | red_hair | short_hair | shorts | folding_chair | medium_hair | sailor_shirt | white_shirt | bangs | blue_neckerchief | blush | upper_body | looking_at_viewer | simple_background | anchor_symbol | happy | white_background | closed_mouth | grin | puffy_short_sleeves | stomach |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------------|:-------|:---------------|:----------|:--------|:--------|:-------------|:---------|:-----------|:-------------|:---------|:----------------|:--------------|:---------------|:--------------|:--------|:-------------------|:--------|:-------------|:--------------------|:--------------------|:----------------|:--------|:-------------------|:---------------|:-------|:----------------------|:----------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | | X | | | X | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | |
| 3 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | X | X | X | X | | X | X | | X | | X | X | | | | X | X |
| CyberHarem/kitashirakawa_chiyuri_touhou | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-15T02:43:20+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-15T03:28:11+00:00 |
8179067022ed804338618bb9e666ddffff500e84 | # Dataset Card for "python-github-code-instruct-filtered-5k"
This dataset is derived from [tomekkorbak/python-github-code](https://huggingface.co/datasets/tomekkorbak/python-github-code), filtered to entries with scores greater than 0.03.
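A minimal sketch of how such a score filter could be reproduced with the `datasets` library; the column name `score` is an assumption, since the card does not document the source schema:
```python
from datasets import load_dataset

# load the unfiltered source dataset
ds = load_dataset("tomekkorbak/python-github-code", split="train")

# keep only rows whose quality score exceeds 0.03
# NOTE: the column name "score" is an assumption, not documented by this card
filtered = ds.filter(lambda row: row["score"] > 0.03)
print(len(filtered))
```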
Feedback and additional columns were generated from OpenAI and Cohere responses. | jtatman/python-github-code-instruct-filtered-5k | [
"task_categories:text-generation",
"task_categories:question-answering",
"task_categories:conversational",
"size_categories:1K<n<10K",
"language:en",
"license:apache-2.0",
"Python",
"Code",
"Github",
"region:us"
] | 2024-01-15T02:48:29+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation", "question-answering", "conversational"], "pretty_name": "github python filtered by score", "dataset_info": {"features": [{"name": "system", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23926332, "num_examples": 4502}], "download_size": 9549180, "dataset_size": 23926332}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["Python", "Code", "Github"]} | 2024-01-15T03:16:03+00:00 |
4a20eb1780a3b180934bb7c1b836b647f8d723cb |
# Dataset Card for Evaluation run of ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1](https://huggingface.co/ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1",
"harness_winogrande_5",
split="train")
```
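To see every available task configuration before picking one, the standard `datasets` helper can be used; a small sketch:
```python
from datasets import get_dataset_config_names

# list the 63 per-task configurations of this details dataset
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1"
)
print(len(configs))
print(configs[:5])
```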
## Latest results
These are the [latest results from run 2024-01-15T02:49:27.291692](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1/blob/main/results_2024-01-15T02-49-27.291692.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6032096253875614,
"acc_stderr": 0.03321637816759657,
"acc_norm": 0.6097201219482176,
"acc_norm_stderr": 0.033909808173675136,
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627904,
"mc2": 0.40550458795616723,
"mc2_stderr": 0.015282277248005289
},
"harness|arc:challenge|25": {
"acc": 0.5614334470989761,
"acc_stderr": 0.014500682618212865,
"acc_norm": 0.6023890784982935,
"acc_norm_stderr": 0.01430175222327954
},
"harness|hellaswag|10": {
"acc": 0.6253734315873332,
"acc_stderr": 0.00483037131784105,
"acc_norm": 0.8228440549691296,
"acc_norm_stderr": 0.003810203308901103
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.029067220146644826,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.029067220146644826
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.04113914981189261,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.04113914981189261
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.025167982333894143,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.025167982333894143
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7225806451612903,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.7225806451612903,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.035014387062967806,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.035014387062967806
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964684,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964684
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.028697873971860677,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.028697873971860677
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.0249393139069408,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.0249393139069408
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524572,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524572
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.017030719339154343,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.017030719339154343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.03132179803083291,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.03132179803083291
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842548,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842548
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543674,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543674
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7841634738186463,
"acc_stderr": 0.01471168438613996,
"acc_norm": 0.7841634738186463,
"acc_norm_stderr": 0.01471168438613996
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388676996,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388676996
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3776536312849162,
"acc_stderr": 0.016214148752136632,
"acc_norm": 0.3776536312849162,
"acc_norm_stderr": 0.016214148752136632
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.02724561304721536,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.02724561304721536
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140453,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409828,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409828
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291467,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291467
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4211212516297262,
"acc_stderr": 0.012610325733489905,
"acc_norm": 0.4211212516297262,
"acc_norm_stderr": 0.012610325733489905
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6066176470588235,
"acc_stderr": 0.029674288281311155,
"acc_norm": 0.6066176470588235,
"acc_norm_stderr": 0.029674288281311155
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6127450980392157,
"acc_stderr": 0.019706875804085637,
"acc_norm": 0.6127450980392157,
"acc_norm_stderr": 0.019706875804085637
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417475,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801301,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653693,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653693
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533214,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533214
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627904,
"mc2": 0.40550458795616723,
"mc2_stderr": 0.015282277248005289
},
"harness|winogrande|5": {
"acc": 0.771112865035517,
"acc_stderr": 0.011807360224025405
},
"harness|gsm8k|5": {
"acc": 0.2896133434420015,
"acc_stderr": 0.012493927348659629
}
}
```
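As a quick sanity check, the per-task entries above can be aggregated by hand; the sketch below assumes the results JSON has been parsed into a Python dict named `results`:
```python
# re-derive the mean accuracy over the hendrycksTest (MMLU) tasks
# (assumes `results` holds the parsed JSON shown above)
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest")}
mean_acc = sum(task["acc"] for task in mmlu.values()) / len(mmlu)
print(f"mean MMLU acc: {mean_acc:.4f}")
```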
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1 | [
"region:us"
] | 2024-01-15T02:51:46+00:00 | {"pretty_name": "Evaluation run of ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1", "dataset_summary": "Dataset automatically created during the evaluation run of model [ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1](https://huggingface.co/ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-15T02:49:27.291692](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1/blob/main/results_2024-01-15T02-49-27.291692.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6032096253875614,\n \"acc_stderr\": 0.03321637816759657,\n \"acc_norm\": 0.6097201219482176,\n \"acc_norm_stderr\": 0.033909808173675136,\n \"mc1\": 0.27906976744186046,\n \"mc1_stderr\": 0.015702107090627904,\n \"mc2\": 0.40550458795616723,\n \"mc2_stderr\": 0.015282277248005289\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5614334470989761,\n \"acc_stderr\": 0.014500682618212865,\n \"acc_norm\": 0.6023890784982935,\n \"acc_norm_stderr\": 0.01430175222327954\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6253734315873332,\n \"acc_stderr\": 0.00483037131784105,\n \"acc_norm\": 0.8228440549691296,\n \"acc_norm_stderr\": 0.003810203308901103\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.029067220146644826,\n \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.029067220146644826\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n 
\"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.04113914981189261,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.04113914981189261\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3941798941798942,\n \"acc_stderr\": 0.025167982333894143,\n \"acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.025167982333894143\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7225806451612903,\n \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.7225806451612903,\n \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.035014387062967806,\n \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.035014387062967806\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964684,\n \"acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964684\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860677,\n 
\"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860677\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.0249393139069408,\n \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.0249393139069408\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524572,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524572\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8036697247706422,\n \"acc_stderr\": 0.017030719339154343,\n \"acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.017030719339154343\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.03132179803083291,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.03132179803083291\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842548,\n \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842548\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.024414947304543674,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.024414947304543674\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 
0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7841634738186463,\n \"acc_stderr\": 0.01471168438613996,\n \"acc_norm\": 0.7841634738186463,\n \"acc_norm_stderr\": 0.01471168438613996\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388676996,\n \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388676996\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3776536312849162,\n \"acc_stderr\": 0.016214148752136632,\n \"acc_norm\": 0.3776536312849162,\n \"acc_norm_stderr\": 0.016214148752136632\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.02724561304721536,\n \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.02724561304721536\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409828,\n \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409828\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291467,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291467\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4211212516297262,\n \"acc_stderr\": 0.012610325733489905,\n \"acc_norm\": 0.4211212516297262,\n \"acc_norm_stderr\": 0.012610325733489905\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6066176470588235,\n \"acc_stderr\": 0.029674288281311155,\n \"acc_norm\": 0.6066176470588235,\n \"acc_norm_stderr\": 0.029674288281311155\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6127450980392157,\n \"acc_stderr\": 0.019706875804085637,\n \"acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.019706875804085637\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417475,\n \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417475\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.02768691358801301,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.02768691358801301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653693,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653693\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533214,\n \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533214\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27906976744186046,\n \"mc1_stderr\": 0.015702107090627904,\n \"mc2\": 0.40550458795616723,\n \"mc2_stderr\": 0.015282277248005289\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.011807360224025405\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.2896133434420015,\n \"acc_stderr\": 0.012493927348659629\n }\n}\n```", "repo_url": "https://huggingface.co/ewqr2130/alignment-handbook-zephyr-7b-sft-full-dpo-5e7-cont1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|arc:challenge|25_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|gsm8k|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hellaswag|10_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T02-49-27.291692.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-15T02-49-27.291692.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T02-49-27.291692.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-15T02-49-27.291692.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T02-49-27.291692.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["**/details_harness|winogrande|5_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-15T02-49-27.291692.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_15T02_49_27.291692", "path": ["results_2024-01-15T02-49-27.291692.parquet"]}, {"split": "latest", "path": ["results_2024-01-15T02-49-27.291692.parquet"]}]}]} | 2024-01-15T02:52:08+00:00 |
9bfc3588cba06f547a9cd4688800186fa8d542d5 | Syma25/llama5.1 | [
"region:us"
] | 2024-01-15T03:01:09+00:00 | {} | 2024-01-15T03:03:44+00:00 |
|
b3b2ccf1d20c09b68d951b9615fc5a22999a8b4e | argmaxinc/librispeech-debug | [
"region:us"
] | 2024-01-15T03:08:34+00:00 | {} | 2024-01-15T03:16:06+00:00 |
|
1e8c945ec30f813133744398a6819aa903c5b720 | blackriderrx/mini-platypus-1 | [
"region:us"
] | 2024-01-15T03:08:56+00:00 | {} | 2024-01-15T03:08:56+00:00 |
|
297a3d4ebc1f9d2d265fd8b255e3d70ce7257511 | TDK1st/Zz-L | [
"region:us"
] | 2024-01-15T03:10:38+00:00 | {} | 2024-01-15T03:10:38+00:00 |
|
0c9ac1ca64444107f80a8a08bccb626bc96fa476 | andersonbcdefg/MEDI-NQ-subset | [
"region:us"
] | 2024-01-15T03:12:35+00:00 | {"dataset_info": {"features": [{"name": "pos", "dtype": "string"}, {"name": "task_name", "dtype": "string"}, {"name": "neg", "dtype": "string"}, {"name": "query", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 72687459.72125435, "num_examples": 50000}], "download_size": 42277611, "dataset_size": 72687459.72125435}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-15T03:21:34+00:00 |
|
99062fec1c4512b005bb695a600f25b3ed440585 | # Reddit demo datasets
| johncbertrand/reddit-demo | [
"region:us"
] | 2024-01-15T03:13:06+00:00 | {} | 2024-01-15T18:23:09+00:00 |
e46bb0b428cbf86bd69f2a35c7df5cb5e5fb35e4 |
# Dataset of sariel/サリエル (Touhou)
This is the dataset of sariel/サリエル (Touhou), containing 45 images and their tags.
The core tags of this character are `long_hair, wings, multiple_wings, angel_wings, very_long_hair, blue_hair, red_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 45 | 42.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sariel_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 45 | 29.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sariel_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 73 | 44.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sariel_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 45 | 39.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sariel_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 73 | 56.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sariel_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
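The `IMG+TXT` packages are plain zip archives in which every image is expected to ship with a same-named `.txt` tag file. Below is a minimal sketch for reading the `800` package; the image/tag file layout is an assumption based on the package type, not something documented above.
```python
import os
import zipfile
from glob import glob
from huggingface_hub import hf_hub_download

# download and extract the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/sariel_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# each image is assumed to ship with a same-named .txt tag file
for txt_path in glob(os.path.join(dataset_dir, '**', '*.txt'), recursive=True):
    with open(txt_path, 'r', encoding='utf-8') as f:
        tags = f.read().strip()
    print(os.path.splitext(txt_path)[0], tags)
```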
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sariel_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, solo, staff, closed_eyes, long_sleeves, blue_dress, smile |
| 1 | 6 |  |  |  |  |  | 1girl, blue_dress, long_sleeves, solo, breasts, closed_mouth, feathered_wings, looking_at_viewer, smile, white_wings, wide_sleeves, holding, angel, bangs, blush, staff, white_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | staff | closed_eyes | long_sleeves | blue_dress | smile | breasts | closed_mouth | feathered_wings | looking_at_viewer | white_wings | wide_sleeves | holding | angel | bangs | blush | white_shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------|:---------------|:-------------|:--------|:----------|:---------------|:------------------|:--------------------|:--------------|:---------------|:----------|:--------|:--------|:--------|:--------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/sariel_touhou | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-15T03:19:21+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-15T03:48:53+00:00 |
2c16b9c01d714a79e0d061570afadf1aa51e0de8 |
# Dataset of satsuki_rin/冴月麟 (Touhou)
This is the dataset of satsuki_rin/冴月麟 (Touhou), containing 10 images and their tags.
The core tags of this character are `blonde_hair, ribbon, bow, hair_bow, short_hair, yellow_eyes, hair_ornament, hair_ribbon, red_bow, red_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 9.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/satsuki_rin_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 5.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/satsuki_rin_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 15 | 8.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/satsuki_rin_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 8.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/satsuki_rin_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 15 | 12.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/satsuki_rin_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/satsuki_rin_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
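Building on the `source` iterator above, the sketch below exports image/caption pairs in the layout commonly used for text-to-image fine-tuning. The exact structure of `item.meta['tags']` (plain list vs. tag-to-score mapping) is an assumption, and it also assumes `LocalSource` can be iterated again; otherwise, rebuild it first.
```python
import os

# a minimal sketch, assuming item.meta['tags'] is either a list of
# tag strings or a tag->score mapping (its exact structure is not
# documented above, so the caption line may need adjusting)
export_dir = 'train_pairs'
os.makedirs(export_dir, exist_ok=True)
for i, item in enumerate(source):
    tags = item.meta['tags']
    caption = ', '.join(tags.keys() if isinstance(tags, dict) else tags)
    item.image.save(os.path.join(export_dir, f'{i}.png'))
    with open(os.path.join(export_dir, f'{i}.txt'), 'w', encoding='utf-8') as f:
        f.write(caption)
```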
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, solo, instrument, smile, long_sleeves, frills, skirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | instrument | smile | long_sleeves | frills | skirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------------|:--------|:---------------|:---------|:--------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X |
| CyberHarem/satsuki_rin_touhou | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-15T03:19:21+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-15T03:23:57+00:00 |
06ba7ddbeb61dbfeb89e948c4ae2dc181ad8c558 | argmaxinc/earnings22-debug | [
"region:us"
] | 2024-01-15T03:19:47+00:00 | {} | 2024-01-15T03:29:13+00:00 |
|
ae93d58655cdce5ab325dd1647b46218c1f0e296 |
# Dataset of sakata_nemuno/坂田ネムノ (Touhou)
This is the dataset of sakata_nemuno/坂田ネムノ (Touhou), containing 257 images and their tags.
The core tags of this character are `long_hair, red_eyes, grey_hair, breasts, wavy_hair, very_long_hair, large_breasts, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 257 | 275.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sakata_nemuno_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 257 | 173.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sakata_nemuno_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 557 | 342.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sakata_nemuno_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 257 | 251.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sakata_nemuno_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 557 | 457.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sakata_nemuno_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sakata_nemuno_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
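The tag metadata also allows simple filtering before training. Below is a minimal sketch that keeps only images tagged `1girl` and `solo`, assuming membership testing works on `item.meta['tags']` (true for both a list of tags and a tag->score mapping):
```python
# a minimal sketch, assuming membership testing works on
# item.meta['tags'] (true for both a list of tags and a
# tag->score mapping)
kept = [
    item for item in source
    if '1girl' in item.meta['tags'] and 'solo' in item.meta['tags']
]
print(f'{len(kept)} images are tagged 1girl + solo')
```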
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, bare_shoulders, detached_sleeves, looking_at_viewer, multicolored_dress, nata_(tool), single_strap, solo, holding_weapon, orange_dress, yellow_dress, collarbone, simple_background, barefoot, closed_mouth, full_body, white_background, blue_sleeves, cleaver, medium_breasts, smile, standing |
| 1 | 5 |  |  |  |  |  | 1girl, barefoot, detached_sleeves, full_body, holding, looking_at_viewer, multicolored_dress, nata_(tool), single_strap, solo, bare_shoulders, open_mouth, weapon, blue_sleeves, smile, standing, yellow_dress |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | detached_sleeves | looking_at_viewer | multicolored_dress | nata_(tool) | single_strap | solo | holding_weapon | orange_dress | yellow_dress | collarbone | simple_background | barefoot | closed_mouth | full_body | white_background | blue_sleeves | cleaver | medium_breasts | smile | standing | holding | open_mouth | weapon |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------------------|:--------------------|:---------------------|:--------------|:---------------|:-------|:-----------------|:---------------|:---------------|:-------------|:--------------------|:-----------|:---------------|:------------|:-------------------|:---------------|:----------|:-----------------|:--------|:-----------|:----------|:-------------|:---------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | X | | | X | | X | | X | | | X | X | X | X | X |
| CyberHarem/sakata_nemuno_touhou | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-15T03:20:43+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-15T04:30:01+00:00 |
ec12f14d2ebb762b314e1108f2869739ea6d3c42 | Berzerker/neocr_dataset | [
"language:en",
"region:us"
] | 2024-01-15T03:21:15+00:00 | {"language": ["en"], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "output_json_dumpsed", "dtype": "string"}]}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/*.parquet"}]}]} | 2024-01-18T01:03:25+00:00 |
|
f88ccaa0534bb4cee5563efa595a558fa3489994 | andersonbcdefg/quora_triplets | [
"region:us"
] | 2024-01-15T03:24:48+00:00 | {"dataset_info": {"features": [{"name": "query", "dtype": "string"}, {"name": "pos", "dtype": "string"}, {"name": "neg", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 17575186, "num_examples": 101762}], "download_size": 10952253, "dataset_size": 17575186}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-15T03:47:25+00:00 |
|
04438b56d90500c083f3d42dd8fca604cd346f2b | Berzerker/wordart | [
"region:us"
] | 2024-01-15T03:26:14+00:00 | {} | 2024-01-15T03:26:14+00:00 |
|
9df78a2d7045b1b5f7aac6e2acebd22a1567f369 | iNeil77/commit-chronicle | [
"region:us"
] | 2024-01-15T03:28:20+00:00 | {"dataset_info": [{"config_name": "C", "features": [{"name": "author", "dtype": "int64"}, {"name": "date", "dtype": "string"}, {"name": "timezone", "dtype": "int64"}, {"name": "hash", "dtype": "string"}, {"name": "message", "dtype": "string"}, {"name": "mods", "list": [{"name": "change_type", "dtype": "string"}, {"name": "old_path", "dtype": "string"}, {"name": "new_path", "dtype": "string"}, {"name": "diff", "dtype": "string"}]}, {"name": "language", "dtype": "string"}, {"name": "license", "dtype": "string"}, {"name": "repo", "dtype": "string"}, {"name": "original_message", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1214269026.0285635, "num_examples": 309153}, {"name": "validation", "num_bytes": 220284785.83363256, "num_examples": 57970}, {"name": "test", "num_bytes": 148589006.99135485, "num_examples": 38340}], "download_size": 516619057, "dataset_size": 1583142818.853551}, {"config_name": "C++", "features": [{"name": "author", "dtype": "int64"}, {"name": "date", "dtype": "string"}, {"name": "timezone", "dtype": "int64"}, {"name": "hash", "dtype": "string"}, {"name": "message", "dtype": "string"}, {"name": "mods", "list": [{"name": "change_type", "dtype": "string"}, {"name": "old_path", "dtype": "string"}, {"name": "new_path", "dtype": "string"}, {"name": "diff", "dtype": "string"}]}, {"name": "language", "dtype": "string"}, {"name": "license", "dtype": "string"}, {"name": "repo", "dtype": "string"}, {"name": "original_message", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3262697231.9482107, "num_examples": 830683}, {"name": "validation", "num_bytes": 766516575.1115581, "num_examples": 201716}, {"name": "test", "num_bytes": 479503779.0820391, "num_examples": 123725}], "download_size": 1779547046, "dataset_size": 4508717586.141808}, {"config_name": "Go", "features": [{"name": "author", "dtype": "int64"}, {"name": "date", "dtype": "string"}, {"name": "timezone", "dtype": "int64"}, {"name": "hash", "dtype": "string"}, {"name": "message", "dtype": "string"}, {"name": "mods", "list": [{"name": "change_type", "dtype": "string"}, {"name": "old_path", "dtype": "string"}, {"name": "new_path", "dtype": "string"}, {"name": "diff", "dtype": "string"}]}, {"name": "language", "dtype": "string"}, {"name": "license", "dtype": "string"}, {"name": "repo", "dtype": "string"}, {"name": "original_message", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2639610249.9324474, "num_examples": 672045}, {"name": "validation", "num_bytes": 509022394.3687841, "num_examples": 133954}, {"name": "test", "num_bytes": 522034184.995527, "num_examples": 134699}], "download_size": 1392783035, "dataset_size": 3670666829.2967587}, {"config_name": "Objective-C", "features": [{"name": "author", "dtype": "int64"}, {"name": "date", "dtype": "string"}, {"name": "timezone", "dtype": "int64"}, {"name": "hash", "dtype": "string"}, {"name": "message", "dtype": "string"}, {"name": "mods", "list": [{"name": "change_type", "dtype": "string"}, {"name": "old_path", "dtype": "string"}, {"name": "new_path", "dtype": "string"}, {"name": "diff", "dtype": "string"}]}, {"name": "language", "dtype": "string"}, {"name": "license", "dtype": "string"}, {"name": "repo", "dtype": "string"}, {"name": "original_message", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 127717945.2224976, "num_examples": 32517}, {"name": "validation", "num_bytes": 4917172.897511136, "num_examples": 1294}, {"name": "test", "num_bytes": 29872823.836446613, "num_examples": 
7708}], "download_size": 52374411, "dataset_size": 162507941.95645535}, {"config_name": "Python", "features": [{"name": "author", "dtype": "int64"}, {"name": "date", "dtype": "string"}, {"name": "timezone", "dtype": "int64"}, {"name": "hash", "dtype": "string"}, {"name": "message", "dtype": "string"}, {"name": "mods", "list": [{"name": "change_type", "dtype": "string"}, {"name": "old_path", "dtype": "string"}, {"name": "new_path", "dtype": "string"}, {"name": "diff", "dtype": "string"}]}, {"name": "language", "dtype": "string"}, {"name": "license", "dtype": "string"}, {"name": "repo", "dtype": "string"}, {"name": "original_message", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5224487604.251047, "num_examples": 1330155}, {"name": "validation", "num_bytes": 807734947.9240026, "num_examples": 212563}, {"name": "test", "num_bytes": 958895166.8964008, "num_examples": 247421}], "download_size": 2161676583, "dataset_size": 6991117719.07145}, {"config_name": "Ruby", "features": [{"name": "author", "dtype": "int64"}, {"name": "date", "dtype": "string"}, {"name": "timezone", "dtype": "int64"}, {"name": "hash", "dtype": "string"}, {"name": "message", "dtype": "string"}, {"name": "mods", "list": [{"name": "change_type", "dtype": "string"}, {"name": "old_path", "dtype": "string"}, {"name": "new_path", "dtype": "string"}, {"name": "diff", "dtype": "string"}]}, {"name": "language", "dtype": "string"}, {"name": "license", "dtype": "string"}, {"name": "repo", "dtype": "string"}, {"name": "original_message", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 714516644.312079, "num_examples": 181916}, {"name": "validation", "num_bytes": 151664764.05368194, "num_examples": 39912}, {"name": "test", "num_bytes": 129571629.38815771, "num_examples": 33433}], "download_size": 243994774, "dataset_size": 995753037.7539186}, {"config_name": "Rust", "features": [{"name": "author", "dtype": "int64"}, {"name": "date", "dtype": "string"}, {"name": "timezone", "dtype": "int64"}, {"name": "hash", "dtype": "string"}, {"name": "message", "dtype": "string"}, {"name": "mods", "list": [{"name": "change_type", "dtype": "string"}, {"name": "old_path", "dtype": "string"}, {"name": "new_path", "dtype": "string"}, {"name": "diff", "dtype": "string"}]}, {"name": "language", "dtype": "string"}, {"name": "license", "dtype": "string"}, {"name": "repo", "dtype": "string"}, {"name": "original_message", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 942800148.1493574, "num_examples": 240037}, {"name": "validation", "num_bytes": 230993126.81136546, "num_examples": 60788}, {"name": "test", "num_bytes": 175047461.6269829, "num_examples": 45167}], "download_size": 541549356, "dataset_size": 1348840736.5877059}, {"config_name": "Swift", "features": [{"name": "author", "dtype": "int64"}, {"name": "date", "dtype": "string"}, {"name": "timezone", "dtype": "int64"}, {"name": "hash", "dtype": "string"}, {"name": "message", "dtype": "string"}, {"name": "mods", "list": [{"name": "change_type", "dtype": "string"}, {"name": "old_path", "dtype": "string"}, {"name": "new_path", "dtype": "string"}, {"name": "diff", "dtype": "string"}]}, {"name": "language", "dtype": "string"}, {"name": "license", "dtype": "string"}, {"name": "repo", "dtype": "string"}, {"name": "original_message", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 397776768.5968331, "num_examples": 101274}, {"name": "validation", "num_bytes": 107262008.79292645, "num_examples": 28227}, {"name": "test", "num_bytes": 
34639763.81034767, "num_examples": 8938}], "download_size": 181314627, "dataset_size": 539678541.2001072}], "configs": [{"config_name": "C", "data_files": [{"split": "train", "path": "C/train-*"}, {"split": "validation", "path": "C/validation-*"}, {"split": "test", "path": "C/test-*"}]}, {"config_name": "C++", "data_files": [{"split": "train", "path": "C++/train-*"}, {"split": "validation", "path": "C++/validation-*"}, {"split": "test", "path": "C++/test-*"}]}, {"config_name": "Go", "data_files": [{"split": "train", "path": "Go/train-*"}, {"split": "validation", "path": "Go/validation-*"}, {"split": "test", "path": "Go/test-*"}]}, {"config_name": "Objective-C", "data_files": [{"split": "train", "path": "Objective-C/train-*"}, {"split": "validation", "path": "Objective-C/validation-*"}, {"split": "test", "path": "Objective-C/test-*"}]}, {"config_name": "Python", "data_files": [{"split": "train", "path": "Python/train-*"}, {"split": "validation", "path": "Python/validation-*"}, {"split": "test", "path": "Python/test-*"}]}, {"config_name": "Ruby", "data_files": [{"split": "train", "path": "Ruby/train-*"}, {"split": "validation", "path": "Ruby/validation-*"}, {"split": "test", "path": "Ruby/test-*"}]}, {"config_name": "Rust", "data_files": [{"split": "train", "path": "Rust/train-*"}, {"split": "validation", "path": "Rust/validation-*"}, {"split": "test", "path": "Rust/test-*"}]}, {"config_name": "Swift", "data_files": [{"split": "train", "path": "Swift/train-*"}, {"split": "validation", "path": "Swift/validation-*"}, {"split": "test", "path": "Swift/test-*"}]}]} | 2024-01-15T03:44:42+00:00 |
|
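The `configs` entry of the commit-chronicle record above declares one config per language, each with `train`/`validation`/`test` splits. A minimal sketch for loading one of them (config and split names are taken from that metadata):
```python
from datasets import load_dataset

# config and split names are taken from the metadata above
commits = load_dataset("iNeil77/commit-chronicle", "Python", split="train")
print(commits[0]["message"])
```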
6ac1f78030da5e63c0c5bce326091feb70ab6418 | sdsadsada/Sisas | [
"region:us"
] | 2024-01-15T03:31:59+00:00 | {} | 2024-01-15T03:37:37+00:00 |
|
9929c1bac59a13bde83bd536dc921123c0c02776 | modelloosrvcc/Peppa | [
"license:openrail",
"region:us"
] | 2024-01-15T03:37:45+00:00 | {"license": "openrail"} | 2024-01-15T03:38:13+00:00 |
|
7223828f81316a5ff1bd3a1032ea7a4af196ac88 | erikbtx/AUDIOERIKVOZPRONTO | [
"license:openrail",
"region:us"
] | 2024-01-15T03:38:54+00:00 | {"license": "openrail"} | 2024-01-15T03:39:26+00:00 |
|
3b744919b354e074dacf1a8e172d7457f5a3ce88 | tqgminh/llava_instruct | [
"region:us"
] | 2024-01-15T03:48:26+00:00 | {} | 2024-01-15T03:48:26+00:00 |
|
c13b946c3b8a1c41368346bcbd9a811ccebac8cc | ura-hcmut/vmlu_vi | [
"size_categories:1K<n<10K",
"language:vi",
"region:us"
] | 2024-01-15T03:53:03+00:00 | {"language": ["vi"], "size_categories": ["1K<n<10K"], "configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "test.jsonl"}, {"split": "valid", "path": "valid.jsonl"}, {"split": "dev", "path": "dev.jsonl"}]}]} | 2024-01-15T03:54:46+00:00 |
|
fbfa963979c8a030daf58c6f156d20a41fb80cec | Domenic091/VOCAL-APENAS2 | [
"license:openrail",
"region:us"
] | 2024-01-15T03:53:55+00:00 | {"license": "openrail"} | 2024-01-15T03:54:07+00:00 |
|
74c6189302bacab7d3d53435819bb6af554f48eb | Singularity4-2/goblet | [
"region:us"
] | 2024-01-15T03:58:15+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 8449447.0, "num_examples": 200}, {"name": "validation", "num_bytes": 965072.0, "num_examples": 23}], "download_size": 9419595, "dataset_size": 9414519.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-20T04:59:23+00:00 |
|
da85ba5da4d1fd76dc378c9cff9c9fe648206c11 | llm-aes/asappp-1-2-instruct | [
"region:us"
] | 2024-01-15T04:02:43+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 29451763, "num_examples": 7166}], "download_size": 8644011, "dataset_size": 29451763}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-15T04:03:19+00:00 |
|
a4ce198c7fdc90b23af0ec718a68bf7894d887b6 | lowres/eggy | [
"region:us"
] | 2024-01-15T04:14:46+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 70619137.86519945, "num_examples": 138}], "download_size": 76957609, "dataset_size": 70619137.86519945}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-15T14:54:49+00:00 |
|
9e9d251270b2bde8338b3198aae6ccf7bc23b9e9 | zedamangas/MiniNoia | [
"license:openrail",
"region:us"
] | 2024-01-15T04:17:25+00:00 | {"license": "openrail"} | 2024-01-15T04:22:44+00:00 |
|
794d3582ccf0c3ab4f7734a4027df58ad47fcbe1 | iNeil77/the-vault-function | [
"region:us"
] | 2024-01-15T04:19:17+00:00 | {"dataset_info": [{"config_name": "c", "features": [{"name": "hexsha", "dtype": "string"}, {"name": "repo", "dtype": "string"}, {"name": "path", "dtype": "string"}, {"name": "license", "sequence": "string"}, {"name": "language", "dtype": "string"}, {"name": "identifier", "dtype": "string"}, {"name": "return_type", "dtype": "string"}, {"name": "original_string", "dtype": "string"}, {"name": "original_docstring", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "code", "dtype": "string"}, {"name": "code_tokens", "sequence": "string"}, {"name": "short_docstring", "dtype": "string"}, {"name": "short_docstring_tokens", "sequence": "string"}, {"name": "comment", "sequence": "string"}, {"name": "parameters", "list": [{"name": "param", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "docstring_params", "struct": [{"name": "returns", "list": [{"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "raises", "list": [{"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "params", "list": [{"name": "identifier", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "default", "dtype": "string"}, {"name": "is_optional", "dtype": "bool"}]}, {"name": "outlier_params", "list": [{"name": "identifier", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "default", "dtype": "string"}, {"name": "is_optional", "dtype": "bool"}]}, {"name": "others", "list": [{"name": "identifier", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}]}]}], "splits": [{"name": "train", "num_bytes": 1618612526, "num_examples": 381207}, {"name": "validation", "num_bytes": 118163214, "num_examples": 27525}, {"name": "test", "num_bytes": 82244493, "num_examples": 19122}], "download_size": 601549243, "dataset_size": 1819020233}, {"config_name": "cpp", "features": [{"name": "hexsha", "dtype": "string"}, {"name": "repo", "dtype": "string"}, {"name": "path", "dtype": "string"}, {"name": "license", "sequence": "string"}, {"name": "language", "dtype": "string"}, {"name": "identifier", "dtype": "string"}, {"name": "return_type", "dtype": "string"}, {"name": "original_string", "dtype": "string"}, {"name": "original_docstring", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "code", "dtype": "string"}, {"name": "code_tokens", "sequence": "string"}, {"name": "short_docstring", "dtype": "string"}, {"name": "short_docstring_tokens", "sequence": "string"}, {"name": "comment", "sequence": "string"}, {"name": "parameters", "list": [{"name": "param", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "docstring_params", "struct": [{"name": "returns", "list": [{"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "raises", "list": [{"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "params", "list": [{"name": 
"identifier", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "default", "dtype": "string"}, {"name": "is_optional", "dtype": "bool"}]}, {"name": "outlier_params", "list": [{"name": "identifier", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "default", "dtype": "string"}, {"name": "is_optional", "dtype": "bool"}]}, {"name": "others", "list": [{"name": "identifier", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}]}]}], "splits": [{"name": "train", "num_bytes": 1745583444, "num_examples": 410907}, {"name": "validation", "num_bytes": 85254767, "num_examples": 20011}, {"name": "test", "num_bytes": 71686667, "num_examples": 18169}], "download_size": 617392067, "dataset_size": 1902524878}, {"config_name": "go", "features": [{"name": "hexsha", "dtype": "string"}, {"name": "repo", "dtype": "string"}, {"name": "path", "dtype": "string"}, {"name": "license", "sequence": "string"}, {"name": "language", "dtype": "string"}, {"name": "identifier", "dtype": "string"}, {"name": "return_type", "dtype": "string"}, {"name": "original_string", "dtype": "string"}, {"name": "original_docstring", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "code", "dtype": "string"}, {"name": "code_tokens", "sequence": "string"}, {"name": "short_docstring", "dtype": "string"}, {"name": "short_docstring_tokens", "sequence": "string"}, {"name": "comment", "sequence": "string"}, {"name": "parameters", "list": [{"name": "param", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "docstring_params", "struct": [{"name": "returns", "list": [{"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "raises", "list": [{"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "params", "list": [{"name": "identifier", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "default", "dtype": "string"}, {"name": "is_optional", "dtype": "bool"}]}, {"name": "outlier_params", "list": [{"name": "identifier", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "default", "dtype": "string"}, {"name": "is_optional", "dtype": "bool"}]}, {"name": "others", "list": [{"name": "identifier", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}]}]}], "splits": [{"name": "train", "num_bytes": 3717971602, "num_examples": 1319547}, {"name": "validation", "num_bytes": 50699286, "num_examples": 19102}, {"name": "test", "num_bytes": 71810505, "num_examples": 25314}], "download_size": 1052043326, "dataset_size": 3840481393}, {"config_name": "python", "features": [{"name": "hexsha", "dtype": "string"}, {"name": "repo", "dtype": "string"}, {"name": "path", "dtype": "string"}, {"name": "license", "sequence": "string"}, {"name": "language", "dtype": "string"}, {"name": "identifier", "dtype": "string"}, {"name": "return_type", "dtype": "string"}, 
{"name": "original_string", "dtype": "string"}, {"name": "original_docstring", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "code", "dtype": "string"}, {"name": "code_tokens", "sequence": "string"}, {"name": "short_docstring", "dtype": "string"}, {"name": "short_docstring_tokens", "sequence": "string"}, {"name": "comment", "sequence": "string"}, {"name": "parameters", "list": [{"name": "param", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "docstring_params", "struct": [{"name": "returns", "list": [{"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "raises", "list": [{"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "params", "list": [{"name": "identifier", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "default", "dtype": "string"}, {"name": "is_optional", "dtype": "bool"}]}, {"name": "outlier_params", "list": [{"name": "identifier", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "default", "dtype": "string"}, {"name": "is_optional", "dtype": "bool"}]}, {"name": "others", "list": [{"name": "identifier", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}]}]}], "splits": [{"name": "train", "num_bytes": 8545493683, "num_examples": 1952110}, {"name": "validation", "num_bytes": 110572316, "num_examples": 30992}, {"name": "test", "num_bytes": 94502917, "num_examples": 21652}], "download_size": 2953145655, "dataset_size": 8750568916}, {"config_name": "ruby", "features": [{"name": "hexsha", "dtype": "string"}, {"name": "repo", "dtype": "string"}, {"name": "path", "dtype": "string"}, {"name": "license", "sequence": "string"}, {"name": "language", "dtype": "string"}, {"name": "identifier", "dtype": "string"}, {"name": "return_type", "dtype": "string"}, {"name": "original_string", "dtype": "string"}, {"name": "original_docstring", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "code", "dtype": "string"}, {"name": "code_tokens", "sequence": "string"}, {"name": "short_docstring", "dtype": "string"}, {"name": "short_docstring_tokens", "sequence": "string"}, {"name": "comment", "sequence": "string"}, {"name": "parameters", "list": [{"name": "param", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "docstring_params", "struct": [{"name": "returns", "list": [{"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "raises", "list": [{"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "params", "list": [{"name": "identifier", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "default", "dtype": "string"}, {"name": "is_optional", "dtype": "bool"}]}, {"name": "outlier_params", "list": [{"name": "identifier", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": 
"docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "default", "dtype": "string"}, {"name": "is_optional", "dtype": "bool"}]}, {"name": "others", "list": [{"name": "identifier", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}]}]}], "splits": [{"name": "train", "num_bytes": 358470286, "num_examples": 112574}, {"name": "validation", "num_bytes": 51183541, "num_examples": 17338}, {"name": "test", "num_bytes": 64582951, "num_examples": 19908}], "download_size": 157505004, "dataset_size": 474236778}, {"config_name": "rust", "features": [{"name": "hexsha", "dtype": "string"}, {"name": "repo", "dtype": "string"}, {"name": "path", "dtype": "string"}, {"name": "license", "sequence": "string"}, {"name": "language", "dtype": "string"}, {"name": "identifier", "dtype": "string"}, {"name": "return_type", "dtype": "string"}, {"name": "original_string", "dtype": "string"}, {"name": "original_docstring", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "code", "dtype": "string"}, {"name": "code_tokens", "sequence": "string"}, {"name": "short_docstring", "dtype": "string"}, {"name": "short_docstring_tokens", "sequence": "string"}, {"name": "comment", "sequence": "string"}, {"name": "parameters", "list": [{"name": "param", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "docstring_params", "struct": [{"name": "returns", "list": [{"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "raises", "list": [{"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "params", "list": [{"name": "identifier", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "default", "dtype": "string"}, {"name": "is_optional", "dtype": "bool"}]}, {"name": "outlier_params", "list": [{"name": "identifier", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}, {"name": "default", "dtype": "string"}, {"name": "is_optional", "dtype": "bool"}]}, {"name": "others", "list": [{"name": "identifier", "dtype": "string"}, {"name": "docstring", "dtype": "string"}, {"name": "docstring_tokens", "sequence": "string"}]}]}], "splits": [{"name": "train", "num_bytes": 730827968, "num_examples": 224015}, {"name": "validation", "num_bytes": 60404939, "num_examples": 16716}, {"name": "test", "num_bytes": 87319651, "num_examples": 23141}], "download_size": 279796696, "dataset_size": 878552558}], "configs": [{"config_name": "c", "data_files": [{"split": "train", "path": "c/train-*"}, {"split": "validation", "path": "c/validation-*"}, {"split": "test", "path": "c/test-*"}]}, {"config_name": "cpp", "data_files": [{"split": "train", "path": "cpp/train-*"}, {"split": "validation", "path": "cpp/validation-*"}, {"split": "test", "path": "cpp/test-*"}]}, {"config_name": "go", "data_files": [{"split": "train", "path": "go/train-*"}, {"split": "validation", "path": "go/validation-*"}, {"split": "test", "path": "go/test-*"}]}, {"config_name": "python", "data_files": [{"split": "train", "path": "python/train-*"}, {"split": "validation", "path": "python/validation-*"}, {"split": "test", "path": 
"python/test-*"}]}, {"config_name": "ruby", "data_files": [{"split": "train", "path": "ruby/train-*"}, {"split": "validation", "path": "ruby/validation-*"}, {"split": "test", "path": "ruby/test-*"}]}, {"config_name": "rust", "data_files": [{"split": "train", "path": "rust/train-*"}, {"split": "validation", "path": "rust/validation-*"}, {"split": "test", "path": "rust/test-*"}]}]} | 2024-01-15T07:54:41+00:00 |
|
134be630da19d09ff1ae1675499690b3ba8ef17c |
# Dataset of kotohime/ことひめ/小兎姫 (Touhou)
This is the dataset of kotohime/ことひめ/小兎姫 (Touhou), containing 78 images and their tags.
The core tags of this character are `long_hair, red_hair, red_eyes, bow, hair_bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 78 | 65.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kotohime_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 78 | 46.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kotohime_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 142 | 79.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kotohime_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 78 | 60.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kotohime_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 142 | 98.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kotohime_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kotohime_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
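The IMG+TXT packages (e.g. `dataset-800.zip`) can also be used without waifuc. Below is a minimal sketch, assuming each image sits next to a same-named `.txt` tag file, which is what the IMG+TXT package type suggests:
```python
import os
import zipfile
from pathlib import Path

from huggingface_hub import hf_hub_download
from PIL import Image

# download and extract the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/kotohime_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)
pack_dir = 'dataset_800'
os.makedirs(pack_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(pack_dir)

# pair each tag file with its image (same stem; the flat layout is assumed)
for txt_path in Path(pack_dir).rglob('*.txt'):
    tags = txt_path.read_text(encoding='utf-8').strip()
    for ext in ('.png', '.jpg', '.jpeg', '.webp'):
        img_path = txt_path.with_suffix(ext)
        if img_path.exists():
            image = Image.open(img_path)
            print(img_path.name, image.size, tags)
            break
```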
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, kimono, solo, smile, ponytail, sash |
| 1 | 7 |  |  |  |  |  | 1girl, long_sleeves, solo, wide_sleeves, bangs, looking_at_viewer, simple_background, smile, yellow_bow, closed_mouth, purple_kimono, white_background, white_kimono, obi, sidelocks |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | kimono | solo | smile | ponytail | sash | long_sleeves | wide_sleeves | bangs | looking_at_viewer | simple_background | yellow_bow | closed_mouth | purple_kimono | white_background | white_kimono | obi | sidelocks |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-------|:--------|:-----------|:-------|:---------------|:---------------|:--------|:--------------------|:--------------------|:-------------|:---------------|:----------------|:-------------------|:---------------|:------|:------------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/kotohime_touhou | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-15T04:21:32+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-15T04:48:09+00:00 |
37e53ba0e3cf7cb8e42d1e2f5deb3adc128206ff |
# Dataset of luize/ルイズ (Touhou)
This is the dataset of luize/ルイズ (Touhou), containing 90 images and their tags.
The core tags of this character are `blonde_hair, hat, yellow_eyes, short_hair, ribbon, twintails, bow, white_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 90 | 54.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/luize_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 90 | 40.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/luize_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 130 | 66.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/luize_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 90 | 51.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/luize_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 130 | 82.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/luize_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/luize_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, purple_neckerchief, purple_sailor_collar, short_sleeves, solo, smile, sun_hat, white_shirt, white_skirt, hat_bow, bangs, purple_bow, closed_eyes, medium_hair, closed_mouth, full_body, happy, low_twintails, looking_at_viewer, blush, breasts, open_mouth, simple_background |
| 1 | 14 |  |  |  |  |  | 1girl, solo, smile, dress, closed_eyes, simple_background, white_background, sailor_collar |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | purple_neckerchief | purple_sailor_collar | short_sleeves | solo | smile | sun_hat | white_shirt | white_skirt | hat_bow | bangs | purple_bow | closed_eyes | medium_hair | closed_mouth | full_body | happy | low_twintails | looking_at_viewer | blush | breasts | open_mouth | simple_background | dress | white_background | sailor_collar |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------------|:-----------------------|:----------------|:-------|:--------|:----------|:--------------|:--------------|:----------|:--------|:-------------|:--------------|:--------------|:---------------|:------------|:--------|:----------------|:--------------------|:--------|:----------|:-------------|:--------------------|:--------|:-------------------|:----------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | |
| 1 | 14 |  |  |  |  |  | X | | | | X | X | | | | | | | X | | | | | | | | | | X | X | X | X |
| CyberHarem/luize_touhou | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-15T04:21:37+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-15T04:45:18+00:00 |
2215b3f793524d845673bb799434df63520f2393 | vlad775/price | [
"region:us"
] | 2024-01-15T04:35:27+00:00 | {} | 2024-01-15T04:37:32+00:00 |
|
6066e9c6f9e75e18f3625a551087bd44fe8a84e0 |
# Quirky Textbook Trove: Compact Excellence for Small Language Model
The Strange dataset is 100% AI-generated, a compilation aligned with the vision of the [Textbooks Are All You Need](https://arxiv.org/abs/2306.11644) and [Textbooks Are All You Need II: phi-1.5 technical report](https://arxiv.org/abs/2309.05463) research. This dataset features 2.7M synthetic textbooks, encapsulating 16GB of raw, deduplicated text data. The unusual name reflects its unconventional synthesis methodology, its compact size, and its emphasis on clear, focused content.
The dataset comprises text documents, each representing a tiny synthetic textbook. The source of this data is text generated by advanced open LLMs, ensuring a high-quality, structured representation across a diverse range of subjects.
## Motivation
The creation of the dataset is driven by the need for high-quality, efficient training data. By emulating the principles outlined in the papers above, this dataset aims to contribute to the development of more efficient language models that can achieve remarkable performance with less data.
## Usage
Researchers and AI practitioners can leverage this dataset for experiments in language model training, particularly those focused on the efficiency and efficacy of models trained on structured, high-quality data.
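For example, a minimal loading sketch with the Hugging Face `datasets` library; the `train` split name and the streaming mode are assumptions, not guarantees from this card:
```python
from datasets import load_dataset

# stream to avoid materializing all 16GB locally
ds = load_dataset("nampdn-ai/tiny-strange-textbooks", split="train", streaming=True)

for record in ds.take(3):
    print(record)  # each record is one tiny synthetic textbook
```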
### Text Length Distribution
The textbooks in this dataset exhibit the following characteristics in terms of text length (measured in characters):
- **Mean**: 6,456.23
- **Standard Deviation**: 2,559.61
- **25th Percentile**: 4,831
- **Median (50th Percentile)**: 6,265
- **75th Percentile**: 8,048
These statistics indicate a varied range of text lengths, providing a comprehensive dataset suitable for diverse applications in language model training.
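A sketch of how these statistics could be reproduced; the `text` column name is an assumption:
```python
import numpy as np
from datasets import load_dataset

ds = load_dataset("nampdn-ai/tiny-strange-textbooks", split="train")

# character count of every textbook
lengths = np.array([len(t) for t in ds["text"]])

print(f"mean={lengths.mean():.2f}, std={lengths.std(ddof=1):.2f}")
print(f"p25={np.percentile(lengths, 25):.0f}, "
      f"median={np.median(lengths):.0f}, "
      f"p75={np.percentile(lengths, 75):.0f}")
```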
## Contribution
Contributions to the dataset are encouraged and valued. Enhancements can range from adding new textbooks to optimizing existing content for better quality and diversity.
## Acknowledgments
The development of this dataset was inspired by the groundbreaking work presented in the papers above. I acknowledge the contributions of all the community members and the original authors (Microsoft Research) who have influenced this project.
### Disclaimer
While every effort has been made to ensure the accuracy of the information contained within this dataset, please note that it is provided 'as is' and without any warranties.
The use of the data is intended for research purposes only. You are advised to verify any information obtained from this dataset before acting upon it.
## Tiny Series
Explore the possibilities and limitations of building Small Language Models with these tiny gems of data!
- [TinyStories](https://arxiv.org/abs/2305.07759): The paper that sparked my interest in the journey of the tiny-* series.
- [tiny-codes](https://huggingface.co/datasets/nampdn-ai/tiny-codes): Collection of 1.6M short and clear code snippets that can help LLMs learn how to reason.
- [tiny-textbooks](https://huggingface.co/datasets/nampdn-ai/tiny-textbooks): 420k "things of internet" synthetic textbooks.
- [tiny-code-textbooks](https://huggingface.co/datasets/nampdn-ai/tiny-code-textbooks): Collection of 207k code explanation synthetic textbooks.
- [tiny-math-textbooks](https://huggingface.co/datasets/nampdn-ai/tiny-math-textbooks): Collection of 635k short math textbooks on various mathematical topics.
- [tiny-orca-textbooks](https://huggingface.co/datasets/nampdn-ai/tiny-orca-textbooks): Synthetic textbooks that help the model learn, in context, how to perform tasks the right way.
- [tiny-webtext](https://huggingface.co/datasets/nampdn-ai/tiny-webtext): A 6GB (4.5M records) variety of diverse webtext enriched with critical thinking methods to make an unbiased English dataset.
- [tiny-lessons](https://huggingface.co/datasets/nampdn-ai/tiny-lessons): A subset of the tiny-textbooks dataset: various lessons about "things of internet" augmented in a bite-sized textbook Markdown format.
- [tiny-bridgedict](https://huggingface.co/datasets/nampdn-ai/tiny-bridgedict): A dataset that links and transfers knowledge between English, Vietnamese, and Chinese in tiny multilingual models.
## Citation
```
@misc {nam_pham_2024,
author = { {Nam Pham} },
title = { tiny-strange-textbooks (Revision 6f304f1) },
year = 2024,
url = { https://huggingface.co/datasets/nampdn-ai/tiny-strange-textbooks },
doi = { 10.57967/hf/1612 },
publisher = { Hugging Face }
}
``` | nampdn-ai/tiny-strange-textbooks | [
"task_categories:text-generation",
"size_categories:1M<n<10M",
"language:en",
"license:apache-2.0",
"synthetic",
"arxiv:2306.11644",
"arxiv:2309.05463",
"arxiv:2305.07759",
"doi:10.57967/hf/1612",
"region:us"
] | 2024-01-15T04:39:00+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["1M<n<10M"], "task_categories": ["text-generation"], "pretty_name": "Tiny Strange Textbooks", "tags": ["synthetic"]} | 2024-02-02T16:15:23+00:00 |
2706d5a2b37e52c240517d7c63caac7b522e96fb | manish2057/chatbot | [
"region:us"
] | 2024-01-15T04:40:08+00:00 | {} | 2024-01-15T04:40:08+00:00 |
|
805f15e5a03238069398bf3596f658d48fd43281 | # Dataset Card for "openhermes_binarized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | jan-hq/openhermes_binarized | [
"region:us"
] | 2024-01-15T04:46:54+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 309587583.1440632, "num_examples": 240402}, {"name": "test", "num_bytes": 3128044.855936845, "num_examples": 2429}], "download_size": 158388623, "dataset_size": 312715628.0}} | 2024-01-15T04:48:45+00:00 |
cc841667724d8a6d6df8ee719f742e1e4dd95b1c | zedamangas/mczimvocal | [
"license:openrail",
"region:us"
] | 2024-01-15T04:50:14+00:00 | {"license": "openrail"} | 2024-01-15T05:37:50+00:00 |
|
2d2015f4b1dd52a3457158ed7e76a8156bbdbda3 | zhihao406/TAMM-DATASET | [
"region:us"
] | 2024-01-15T05:01:24+00:00 | {} | 2024-01-15T07:25:16+00:00 |
|
2b4b880b0d69d5a3ad7381d13dcf3bde914d9e65 | NoahMartinezXiang/CREMA-D | [
"license:apache-2.0",
"region:us"
] | 2024-01-15T05:02:09+00:00 | {"license": "apache-2.0"} | 2024-01-19T14:49:16+00:00 |
|
58806e08bd61d0698b5eb3e799317e10cdb6a482 | AsphyXIA/baarat-hin-v1 | [
"license:mit",
"region:us"
] | 2024-01-15T05:05:51+00:00 | {"license": "mit", "dataset_info": {"features": [{"name": "src", "dtype": "string"}, {"name": "tgt", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1758609731, "num_examples": 5062893}], "download_size": 935211726, "dataset_size": 1758609731}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-15T05:06:52+00:00 |
|
2015ac98e71260fc1c4325dcf790e6d5f725d36a | Sevll/hb | [
"license:apache-2.0",
"region:us"
] | 2024-01-15T05:06:07+00:00 | {"license": "apache-2.0"} | 2024-01-15T05:06:07+00:00 |
|
14a941854dfdedf7df3436ce6e96badd4b44fbc0 | glitchy222222/test | [
"region:us"
] | 2024-01-15T05:06:09+00:00 | {} | 2024-01-15T05:10:08+00:00 |
|
22ea4fa056d40a125811b02e13dc8910df17f19f | SoorajK1/Two_chunks-2893c985-e3c3-492c-865f-95d044e1a438 | [
"region:us"
] | 2024-01-15T05:11:17+00:00 | {} | 2024-01-15T05:11:20+00:00 |
|
5abeacf21c552b34c08501b19452eab8ad4cb06e | # Dataset Card for "dolphin_binarized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | jan-hq/dolphin_binarized | [
"region:us"
] | 2024-01-15T05:13:04+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 1571862982.8863597, "num_examples": 882938}, {"name": "test", "num_bytes": 15878177.113640415, "num_examples": 8919}], "download_size": 856689595, "dataset_size": 1587741160.0}} | 2024-01-15T06:24:10+00:00 |
aa030edf2d1a97e8cc31ee9b57cd6d2233f2d389 | haisonle001/cmc_dedup | [
"region:us"
] | 2024-01-15T05:13:18+00:00 | {"dataset_info": {"features": [{"name": "url", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 8266460422, "num_examples": 429350}], "download_size": 2814231645, "dataset_size": 8266460422}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-15T05:55:38+00:00 |
|
2f403a71e09b673fcc15387745d90dba10af54b8 | jilp00/youtoks-transcripts-run01 | [
"region:us"
] | 2024-01-15T05:16:11+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7912963, "num_examples": 9358}], "download_size": 4134655, "dataset_size": 7912963}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-15T05:16:22+00:00 |
|
83fa3665302f983c9dddbafdb6539e5acf82f382 | SoorajK1/Two_chunks-ed85502b-921b-4762-94d9-fe87677c46df | [
"region:us"
] | 2024-01-15T05:16:38+00:00 | {} | 2024-01-15T05:16:41+00:00 |
|
a29511d72a28758e7b5b2797507388e57a54ef74 | SoorajK1/Two_chunks-6d3abcb2-6ee7-4299-bb9c-0d2db9026305 | [
"region:us"
] | 2024-01-15T05:16:41+00:00 | {} | 2024-01-15T05:16:43+00:00 |
|
319ef48f3f8ee55f64c416a1ecb438ab9f91a5cf | SoorajK1/Two_chunks-5f11ee4a-70df-4d60-b702-109e24ea02eb | [
"region:us"
] | 2024-01-15T05:16:51+00:00 | {} | 2024-01-15T05:16:53+00:00 |
|
0598372c3f6131ad877e7326fd91f5701ab65b77 | maulinnasari/dataset_ext_20_mn | [
"region:us"
] | 2024-01-15T05:19:01+00:00 | {"dataset_info": {"features": [{"name": "document", "sequence": "string"}, {"name": "summary", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 160065061, "num_examples": 44972}, {"name": "validation", "num_bytes": 19636553, "num_examples": 5622}, {"name": "test", "num_bytes": 19797897, "num_examples": 5622}], "download_size": 124783985, "dataset_size": 199499511}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-15T05:19:12+00:00 |
|
e755510595b55aec0f28182ca816f5e0c419b0b3 | SoorajK1/Two_chunks-966fbceb-3587-4826-ada8-e63683c086b1 | [
"region:us"
] | 2024-01-15T05:19:12+00:00 | {} | 2024-01-15T05:19:14+00:00 |
|
cc38835304f173951942004d014231f1b95645ab | SoorajK1/Two_chunks-156bc30e-e7c6-4cf0-ad74-5090505a0463 | [
"region:us"
] | 2024-01-15T05:19:24+00:00 | {} | 2024-01-15T05:19:27+00:00 |
|
d32678464cb927d10725bde31398219db9bb42a2 |
# Dataset of elis (Touhou)
This is the dataset of elis (Touhou), containing 108 images and their tags.
The core tags of this character are `blonde_hair, bow, long_hair, wings, hair_bow, bat_wings, pointy_ears, facial_mark, red_bow, hair_ornament, red_eyes, hair_flower, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 108 | 90.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elis_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 108 | 66.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elis_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 208 | 123.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elis_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 108 | 85.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elis_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 208 | 149.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elis_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/elis_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
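As a starting point for such mining, here is a small sketch that counts the most frequent tags over an extracted raw dataset; it assumes `item.meta['tags']` is either a tag-to-score mapping or a plain list, which may vary across waifuc versions:
```python
from collections import Counter

from waifuc.source import LocalSource

counter = Counter()
for item in LocalSource('dataset_dir'):
    tags = item.meta['tags']
    # tags may be a tag -> score mapping or a plain list; count names only
    counter.update(tags.keys() if isinstance(tags, dict) else tags)

for tag, count in counter.most_common(20):
    print(f'{tag}: {count}')
```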
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 26 |  |  |  |  |  | 1girl, solo, star_(symbol), skirt, vest, wand, smile, flower |
| 1 | 8 |  |  |  |  |  | 1girl, long_sleeves, red_skirt, solo, star_(symbol), white_shirt, looking_at_viewer, open_vest, red_bowtie, smile, black_vest, closed_mouth, flower, simple_background, long_skirt, white_background, bangs, collared_shirt, holding_wand, arms_behind_back, puffy_sleeves |
| 2 | 8 |  |  |  |  |  | 1girl, red_bowtie, red_skirt, star_(symbol), white_shirt, black_vest, frilled_skirt, full_body, holding_wand, juliet_sleeves, looking_at_viewer, smile, solo, bangs, flower, open_mouth, open_vest, long_skirt, black_footwear, blush, fang, mary_janes, purple_eyes, buttons, chibi, one_eye_closed, puffy_long_sleeves, red_footwear |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | star_(symbol) | skirt | vest | wand | smile | flower | long_sleeves | red_skirt | white_shirt | looking_at_viewer | open_vest | red_bowtie | black_vest | closed_mouth | simple_background | long_skirt | white_background | bangs | collared_shirt | holding_wand | arms_behind_back | puffy_sleeves | frilled_skirt | full_body | juliet_sleeves | open_mouth | black_footwear | blush | fang | mary_janes | purple_eyes | buttons | chibi | one_eye_closed | puffy_long_sleeves | red_footwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------------|:--------|:-------|:-------|:--------|:---------|:---------------|:------------|:--------------|:--------------------|:------------|:-------------|:-------------|:---------------|:--------------------|:-------------|:-------------------|:--------|:-----------------|:---------------|:-------------------|:----------------|:----------------|:------------|:-----------------|:-------------|:-----------------|:--------|:-------|:-------------|:--------------|:----------|:--------|:-----------------|:---------------------|:---------------|
| 0 | 26 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | | | | X | X | | X | X | X | X | X | X | | | X | | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/elis_touhou | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-15T05:19:42+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-15T05:48:28+00:00 |
9ee635b81bf0dcaf517526d68ab2a41db7c2076d |
# Dataset of sara/サラ (Touhou)
This is the dataset of sara/サラ (Touhou), containing 58 images and their tags.
The core tags of this character are `pink_hair, short_hair, pink_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 58 | 32.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sara_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 58 | 26.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sara_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 92 | 41.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sara_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 58 | 31.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sara_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 92 | 46.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sara_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sara_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 33 |  |  |  |  |  | 1girl, solo, smile, red_dress, looking_at_viewer, one_side_up, short_sleeves, simple_background, bangs, full_body, open_mouth, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | red_dress | looking_at_viewer | one_side_up | short_sleeves | simple_background | bangs | full_body | open_mouth | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:------------|:--------------------|:--------------|:----------------|:--------------------|:--------|:------------|:-------------|:-------------------|
| 0 | 33 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/sara_touhou | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-15T05:19:45+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-15T05:47:40+00:00 |
38ea21b177794c97f5db2f4bab731e4157a27f26 | wessmetal/andrematos | [
"license:bsd",
"region:us"
] | 2024-01-15T05:28:20+00:00 | {"license": "bsd"} | 2024-01-15T05:29:01+00:00 |
|
ca9521d03c1b3c95d269f3267d834ba8c688cebf | presencesw/webglm_test | [
"region:us"
] | 2024-01-15T05:29:37+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "references", "sequence": "string"}, {"name": "len", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 480967.01099153265, "num_examples": 186}, {"name": "validation", "num_bytes": 295057.992, "num_examples": 114}, {"name": "test", "num_bytes": 255117.03, "num_examples": 98}], "download_size": 1063212, "dataset_size": 1031142.0329915327}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-15T06:17:40+00:00 |
|
e9cf26b57168b10e90b1044e0c10f3c17107c474 | SoorajK1/Two_chunks-743333a1-473d-4f61-8101-7917847e1838 | [
"region:us"
] | 2024-01-15T05:30:13+00:00 | {} | 2024-01-15T05:30:16+00:00 |
|
a917fc8816a8972444d89a3cf78a2f6b6312523c | SoorajK1/Two_chunks-dc59ac67-1a5c-41d9-936f-732b17a9c768 | [
"region:us"
] | 2024-01-15T05:31:03+00:00 | {} | 2024-01-15T05:31:06+00:00 |
|
cc70c44818e2947aee776a36e20df1871f2fe8b8 | SoorajK1/two_chunks_1637_1638 | [
"region:us"
] | 2024-01-15T05:32:07+00:00 | {} | 2024-01-15T05:32:12+00:00 |
|
bffe1ceccea1737655719e13d7ccfac9ecf1508b | SoorajK1/two_chunks_1638_1639 | [
"region:us"
] | 2024-01-15T05:32:37+00:00 | {} | 2024-01-15T05:32:40+00:00 |
|
e01aa9cc275268b34b0f5b5b193026c78a0316cb | epinnock/commit-diffs | [
"region:us"
] | 2024-01-15T05:37:09+00:00 | {"dataset_info": {"features": [{"name": "commit", "dtype": "string"}, {"name": "old_file", "dtype": "string"}, {"name": "new_file", "dtype": "string"}, {"name": "old_contents", "dtype": "string"}, {"name": "new_contents", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "message", "dtype": "string"}, {"name": "lang", "dtype": "string"}, {"name": "license", "dtype": "string"}, {"name": "repos", "dtype": "string"}, {"name": "diff", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 336540826, "num_examples": 117081}], "download_size": 162155567, "dataset_size": 336540826}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-15T05:37:17+00:00 |
|
7082898e6ef351b58289c7de89b1dc440061a22a | pran1805/CompaniesData | [
"region:us"
] | 2024-01-15T05:50:40+00:00 | {} | 2024-01-15T06:01:23+00:00 |
|
6e43b689582f7e93e8e1667d5fe8c3c51de27096 | # Dataset Card for "oasst2_top1"
* Top 1% conversations of https://huggingface.co/datasets/OpenAssistant/oasst2
* generated using https://github.com/blancsw/deep_4_all/blob/main/datasets/oasst/convert.py | g-ronimo/oasst2_top1 | [
"license:apache-2.0",
"region:us"
] | 2024-01-15T05:54:05+00:00 | {"license": "apache-2.0", "dataset_info": {"features": [{"name": "conversation", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 24247056, "num_examples": 13757}], "download_size": 14029074, "dataset_size": 24247056}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-15T15:38:08+00:00 |
0cc818c08d23abbf02e0ac6dbe6fbd9dbb8a92f7 | # Dataset Card for "oasst2_top1_fr-en-de-es-it"
* Top 1% conversations of https://huggingface.co/datasets/OpenAssistant/oasst2
* language-filtered: fr, en, de, es, it
* generated using https://github.com/blancsw/deep_4_all/blob/main/datasets/oasst/convert.py
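A minimal loading sketch; the `conversation` column of `role`/`content` messages is taken from this repo's metadata, and the printing format is illustrative:
```python
from datasets import load_dataset

ds = load_dataset("g-ronimo/oasst2_top1_fr-en-de-es-it", split="train")

# each row holds a "conversation": a list of {"role", "content"} messages
for message in ds[0]["conversation"]:
    print(f'{message["role"]}: {message["content"][:80]}')
```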
| g-ronimo/oasst2_top1_fr-en-de-es-it | [
"license:apache-2.0",
"region:us"
] | 2024-01-15T05:56:21+00:00 | {"license": "apache-2.0", "dataset_info": {"features": [{"name": "conversation", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 18301524, "num_examples": 10746}], "download_size": 10477478, "dataset_size": 18301524}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-15T15:36:54+00:00 |
e75a4b3158b8ed9296db976269af640613a5ac05 | alvwjy/tokenized_dataset | [
"region:us"
] | 2024-01-15T05:59:30+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}], "splits": [{"name": "train", "num_bytes": 6163554505, "num_examples": 691655}, {"name": "test", "num_bytes": 1548416383, "num_examples": 172914}], "download_size": 3102128888, "dataset_size": 7711970888}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-15T06:25:56+00:00 |
|
f27ae4338b712c5fa298ea9d6965273c884292db | openerotica/gorgon-lima-v0.1 | [
"license:apache-2.0",
"region:us"
] | 2024-01-15T06:02:43+00:00 | {"license": "apache-2.0"} | 2024-01-15T18:31:43+00:00 |
|
716ee7d981b75bff1623790b8b8edf6075df3ea7 | pran1805/CEO_Database | [
"region:us"
] | 2024-01-15T06:05:25+00:00 | {} | 2024-01-15T09:04:04+00:00 |
|
b95d1592764cf8c71f2d756950d02c842a1b10fb | sjonas50/test | [
"license:creativeml-openrail-m",
"region:us"
] | 2024-01-15T06:05:30+00:00 | {"license": "creativeml-openrail-m"} | 2024-01-15T06:10:43+00:00 |
|
a1a2689323a8ca3f54fbc23142fbc00efa6b577f | Maaz911/llama_data_v1.0.0 | [
"region:us"
] | 2024-01-15T06:17:02+00:00 | {} | 2024-01-15T06:18:06+00:00 |
|
fa145046b6c8c2f52755be3404271ec97292103d | GGLS/mixed_math_data | [
"region:us"
] | 2024-01-15T06:18:27+00:00 | {} | 2024-01-16T12:10:00+00:00 |
|
bbaf6cc5fd77d069a23b4433c832f81f4275a44d | dderr/tdataset | [
"task_categories:text-generation",
"task_categories:question-answering",
"task_categories:table-question-answering",
"size_categories:10K<n<100K",
"language:en",
"license:cc-by-4.0",
"SQL",
"code",
"NLP",
"text-to-sql",
"context-sql",
"spider",
"wikisql",
"sqlglot",
"region:us"
] | 2024-01-15T06:24:28+00:00 | {"language": ["en"], "license": "cc-by-4.0", "size_categories": ["10K<n<100K"], "task_categories": ["text-generation", "question-answering", "table-question-answering", "test-categories"], "pretty_name": "sql-create-context", "tags": ["SQL", "code", "NLP", "text-to-sql", "context-sql", "spider", "wikisql", "sqlglot"]} | 2024-01-15T06:39:32+00:00 |
|
5f3d9dc0a5bea1581c8c6b968444c4235df91965 | wesley7137/physics_zephyrformat_SFT | [
"region:us"
] | 2024-01-15T06:25:09+00:00 | {} | 2024-01-15T06:25:31+00:00 |
|
63e24d2125f0704568d4a25adf2a1247bd16f976 |
# Dataset Card for Evaluation run of deepseek-ai/deepseek-moe-16b-base
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [deepseek-ai/deepseek-moe-16b-base](https://huggingface.co/deepseek-ai/deepseek-moe-16b-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_deepseek-ai__deepseek-moe-16b-base",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-15T06:33:48.729928](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-moe-16b-base/blob/main/results_2024-01-15T06-33-48.729928.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.465522984657348,
"acc_stderr": 0.034469796748715614,
"acc_norm": 0.46990944729307677,
"acc_norm_stderr": 0.03523647567293407,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041836,
"mc2": 0.3607930335233562,
"mc2_stderr": 0.01354653975819568
},
"harness|arc:challenge|25": {
"acc": 0.49658703071672355,
"acc_stderr": 0.014611050403244077,
"acc_norm": 0.5324232081911263,
"acc_norm_stderr": 0.014580637569995423
},
"harness|hellaswag|10": {
"acc": 0.5957976498705437,
"acc_stderr": 0.004897340793314379,
"acc_norm": 0.7977494523003386,
"acc_norm_stderr": 0.004008571431483689
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3925925925925926,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.3925925925925926,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4716981132075472,
"acc_stderr": 0.0307235352490061,
"acc_norm": 0.4716981132075472,
"acc_norm_stderr": 0.0307235352490061
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3930635838150289,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.3930635838150289,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179327,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179327
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37446808510638296,
"acc_stderr": 0.031639106653672915,
"acc_norm": 0.37446808510638296,
"acc_norm_stderr": 0.031639106653672915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.041857744240220554,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.041857744240220554
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.023456037383982022,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.023456037383982022
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4870967741935484,
"acc_stderr": 0.028434533152681855,
"acc_norm": 0.4870967741935484,
"acc_norm_stderr": 0.028434533152681855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5393939393939394,
"acc_stderr": 0.03892207016552012,
"acc_norm": 0.5393939393939394,
"acc_norm_stderr": 0.03892207016552012
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03540294377095367,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03540294377095367
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.616580310880829,
"acc_stderr": 0.03508984236295341,
"acc_norm": 0.616580310880829,
"acc_norm_stderr": 0.03508984236295341
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.41025641025641024,
"acc_stderr": 0.02493931390694078,
"acc_norm": 0.41025641025641024,
"acc_norm_stderr": 0.02493931390694078
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959912,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959912
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4327731092436975,
"acc_stderr": 0.03218358107742613,
"acc_norm": 0.4327731092436975,
"acc_norm_stderr": 0.03218358107742613
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6293577981651376,
"acc_stderr": 0.02070745816435298,
"acc_norm": 0.6293577981651376,
"acc_norm_stderr": 0.02070745816435298
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.33796296296296297,
"acc_stderr": 0.03225941352631295,
"acc_norm": 0.33796296296296297,
"acc_norm_stderr": 0.03225941352631295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.03495624522015478,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.03495624522015478
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6118143459915611,
"acc_stderr": 0.031722950043323296,
"acc_norm": 0.6118143459915611,
"acc_norm_stderr": 0.031722950043323296
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5515695067264574,
"acc_stderr": 0.03337883736255098,
"acc_norm": 0.5515695067264574,
"acc_norm_stderr": 0.03337883736255098
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.043285772152629715,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.043285772152629715
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5537190082644629,
"acc_stderr": 0.0453793517794788,
"acc_norm": 0.5537190082644629,
"acc_norm_stderr": 0.0453793517794788
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437056,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437056
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5337423312883436,
"acc_stderr": 0.03919415545048411,
"acc_norm": 0.5337423312883436,
"acc_norm_stderr": 0.03919415545048411
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.6407766990291263,
"acc_stderr": 0.04750458399041694,
"acc_norm": 0.6407766990291263,
"acc_norm_stderr": 0.04750458399041694
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7435897435897436,
"acc_stderr": 0.02860595370200425,
"acc_norm": 0.7435897435897436,
"acc_norm_stderr": 0.02860595370200425
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6436781609195402,
"acc_stderr": 0.0171258537627559,
"acc_norm": 0.6436781609195402,
"acc_norm_stderr": 0.0171258537627559
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.47398843930635837,
"acc_stderr": 0.02688264343402289,
"acc_norm": 0.47398843930635837,
"acc_norm_stderr": 0.02688264343402289
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.02847293847803353,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.02847293847803353
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5241157556270096,
"acc_stderr": 0.028365041542564577,
"acc_norm": 0.5241157556270096,
"acc_norm_stderr": 0.028365041542564577
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5030864197530864,
"acc_stderr": 0.02782021415859437,
"acc_norm": 0.5030864197530864,
"acc_norm_stderr": 0.02782021415859437
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.32269503546099293,
"acc_stderr": 0.027889139300534785,
"acc_norm": 0.32269503546099293,
"acc_norm_stderr": 0.027889139300534785
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3494132985658409,
"acc_stderr": 0.012177306252786698,
"acc_norm": 0.3494132985658409,
"acc_norm_stderr": 0.012177306252786698
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3897058823529412,
"acc_stderr": 0.029624663581159703,
"acc_norm": 0.3897058823529412,
"acc_norm_stderr": 0.029624663581159703
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.44281045751633985,
"acc_stderr": 0.020095083154577347,
"acc_norm": 0.44281045751633985,
"acc_norm_stderr": 0.020095083154577347
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5265306122448979,
"acc_stderr": 0.03196412734523272,
"acc_norm": 0.5265306122448979,
"acc_norm_stderr": 0.03196412734523272
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6567164179104478,
"acc_stderr": 0.03357379665433431,
"acc_norm": 0.6567164179104478,
"acc_norm_stderr": 0.03357379665433431
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041836,
"mc2": 0.3607930335233562,
"mc2_stderr": 0.01354653975819568
},
"harness|winogrande|5": {
"acc": 0.7371744277821626,
"acc_stderr": 0.012370922527262006
},
"harness|gsm8k|5": {
"acc": 0.1728582259287339,
"acc_stderr": 0.01041543224620057
}
}
```
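If you only need these aggregate numbers rather than the per-sample details, a lightweight alternative is to fetch the results JSON directly. A minimal sketch using `huggingface_hub`, with the filename taken from the link above; the handling of the top-level layout is an assumption, since the snippet above may be nested under a "results" key in the actual file:
```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results file from this dataset repository.
results_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_deepseek-ai__deepseek-moe-16b-base",
    repo_type="dataset",
    filename="results_2024-01-15T06-33-48.729928.json",
)

with open(results_path) as f:
    data = json.load(f)

# The snippet above shows the metrics dict directly; in the repo file the
# same dict may sit under a "results" key, so handle both layouts.
metrics = data.get("results", data)
for name, value in metrics["all"].items():
    print(f"{name}: {value:.4f}")
```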
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_deepseek-ai__deepseek-moe-16b-base | [
"region:us"
] | 2024-01-15T06:35:55+00:00 | {"pretty_name": "Evaluation run of deepseek-ai/deepseek-moe-16b-base", "dataset_summary": "Dataset automatically created during the evaluation run of model [deepseek-ai/deepseek-moe-16b-base](https://huggingface.co/deepseek-ai/deepseek-moe-16b-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_deepseek-ai__deepseek-moe-16b-base\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-15T06:33:48.729928](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-moe-16b-base/blob/main/results_2024-01-15T06-33-48.729928.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.465522984657348,\n \"acc_stderr\": 0.034469796748715614,\n \"acc_norm\": 0.46990944729307677,\n \"acc_norm_stderr\": 0.03523647567293407,\n \"mc1\": 0.23745410036719705,\n \"mc1_stderr\": 0.014896277441041836,\n \"mc2\": 0.3607930335233562,\n \"mc2_stderr\": 0.01354653975819568\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.49658703071672355,\n \"acc_stderr\": 0.014611050403244077,\n \"acc_norm\": 0.5324232081911263,\n \"acc_norm_stderr\": 0.014580637569995423\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5957976498705437,\n \"acc_stderr\": 0.004897340793314379,\n \"acc_norm\": 0.7977494523003386,\n \"acc_norm_stderr\": 0.004008571431483689\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3925925925925926,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.3925925925925926,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249034,\n \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249034\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4716981132075472,\n \"acc_stderr\": 0.0307235352490061,\n \"acc_norm\": 0.4716981132075472,\n \"acc_norm_stderr\": 0.0307235352490061\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.5347222222222222,\n \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3930635838150289,\n \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.3930635838150289,\n \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179327,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179327\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.37446808510638296,\n \"acc_stderr\": 0.031639106653672915,\n \"acc_norm\": 0.37446808510638296,\n \"acc_norm_stderr\": 0.031639106653672915\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.041857744240220554,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.041857744240220554\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.023456037383982022,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.023456037383982022\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4870967741935484,\n \"acc_stderr\": 0.028434533152681855,\n \"acc_norm\": 0.4870967741935484,\n \"acc_norm_stderr\": 0.028434533152681855\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5393939393939394,\n \"acc_stderr\": 0.03892207016552012,\n \"acc_norm\": 0.5393939393939394,\n \"acc_norm_stderr\": 0.03892207016552012\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03540294377095367,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03540294377095367\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.616580310880829,\n \"acc_stderr\": 0.03508984236295341,\n \"acc_norm\": 0.616580310880829,\n \"acc_norm_stderr\": 0.03508984236295341\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.41025641025641024,\n \"acc_stderr\": 0.02493931390694078,\n \"acc_norm\": 0.41025641025641024,\n \"acc_norm_stderr\": 0.02493931390694078\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959912,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959912\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4327731092436975,\n \"acc_stderr\": 0.03218358107742613,\n \"acc_norm\": 0.4327731092436975,\n \"acc_norm_stderr\": 0.03218358107742613\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6293577981651376,\n \"acc_stderr\": 0.02070745816435298,\n \"acc_norm\": 0.6293577981651376,\n \"acc_norm_stderr\": 0.02070745816435298\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.33796296296296297,\n \"acc_stderr\": 0.03225941352631295,\n \"acc_norm\": 0.33796296296296297,\n \"acc_norm_stderr\": 0.03225941352631295\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5441176470588235,\n \"acc_stderr\": 0.03495624522015478,\n \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.03495624522015478\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6118143459915611,\n \"acc_stderr\": 0.031722950043323296,\n \"acc_norm\": 0.6118143459915611,\n \"acc_norm_stderr\": 0.031722950043323296\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5515695067264574,\n \"acc_stderr\": 0.03337883736255098,\n \"acc_norm\": 0.5515695067264574,\n \"acc_norm_stderr\": 0.03337883736255098\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.043285772152629715,\n \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.043285772152629715\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5537190082644629,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\": 0.5537190082644629,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.04832853553437056,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.04832853553437056\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5337423312883436,\n \"acc_stderr\": 0.03919415545048411,\n \"acc_norm\": 0.5337423312883436,\n \"acc_norm_stderr\": 0.03919415545048411\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.04750458399041694,\n \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.04750458399041694\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7435897435897436,\n \"acc_stderr\": 0.02860595370200425,\n \"acc_norm\": 0.7435897435897436,\n \"acc_norm_stderr\": 0.02860595370200425\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6436781609195402,\n \"acc_stderr\": 
0.0171258537627559,\n \"acc_norm\": 0.6436781609195402,\n \"acc_norm_stderr\": 0.0171258537627559\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.47398843930635837,\n \"acc_stderr\": 0.02688264343402289,\n \"acc_norm\": 0.47398843930635837,\n \"acc_norm_stderr\": 0.02688264343402289\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5522875816993464,\n \"acc_stderr\": 0.02847293847803353,\n \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.02847293847803353\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5241157556270096,\n \"acc_stderr\": 0.028365041542564577,\n \"acc_norm\": 0.5241157556270096,\n \"acc_norm_stderr\": 0.028365041542564577\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5030864197530864,\n \"acc_stderr\": 0.02782021415859437,\n \"acc_norm\": 0.5030864197530864,\n \"acc_norm_stderr\": 0.02782021415859437\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.32269503546099293,\n \"acc_stderr\": 0.027889139300534785,\n \"acc_norm\": 0.32269503546099293,\n \"acc_norm_stderr\": 0.027889139300534785\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3494132985658409,\n \"acc_stderr\": 0.012177306252786698,\n \"acc_norm\": 0.3494132985658409,\n \"acc_norm_stderr\": 0.012177306252786698\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3897058823529412,\n \"acc_stderr\": 0.029624663581159703,\n \"acc_norm\": 0.3897058823529412,\n \"acc_norm_stderr\": 0.029624663581159703\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.44281045751633985,\n \"acc_stderr\": 0.020095083154577347,\n \"acc_norm\": 0.44281045751633985,\n \"acc_norm_stderr\": 0.020095083154577347\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.509090909090909,\n \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.509090909090909,\n \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5265306122448979,\n \"acc_stderr\": 0.03196412734523272,\n \"acc_norm\": 0.5265306122448979,\n \"acc_norm_stderr\": 0.03196412734523272\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6567164179104478,\n \"acc_stderr\": 0.03357379665433431,\n \"acc_norm\": 0.6567164179104478,\n \"acc_norm_stderr\": 0.03357379665433431\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n \"mc1_stderr\": 0.014896277441041836,\n \"mc2\": 0.3607930335233562,\n \"mc2_stderr\": 0.01354653975819568\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7371744277821626,\n \"acc_stderr\": 0.012370922527262006\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1728582259287339,\n \"acc_stderr\": 0.01041543224620057\n }\n}\n```", "repo_url": 
"https://huggingface.co/deepseek-ai/deepseek-moe-16b-base", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|arc:challenge|25_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|gsm8k|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hellaswag|10_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T06-33-48.729928.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T06-33-48.729928.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-15T06-33-48.729928.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-15T06-33-48.729928.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T06-33-48.729928.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T06-33-48.729928.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["**/details_harness|winogrande|5_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-15T06-33-48.729928.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_15T06_33_48.729928", "path": ["results_2024-01-15T06-33-48.729928.parquet"]}, {"split": "latest", "path": 
["results_2024-01-15T06-33-48.729928.parquet"]}]}]} | 2024-01-15T06:36:22+00:00 |
ac07080f11c15b38c1a5cdc02b0be856bd3abb80 | [
{
"id": "0",
"translation": {
"es": "8a - 4b + 16c + 12d",
"pt": "Para factorizar la expresión (8a - 4b + 16c + 12d), primero agrupemos los términos de manera adecuada. La expresión se puede reorganizar en dos grupos: (8a - 4b) + (16c + 12d). Ahora, en cada grupo, factorizamos los términos comunes: Grupo 1: Factor común de (4) en (8a - 4b): 4(2a - b) . Grupo 2: Factor común de (4) en (16c + 12d): 4(4c + 3d). Finalmente, podemos escribir la expresión factorizada como la suma de los dos grupos factorizados: 4(2a - b) + 4(4c + 3d) "
}
},
{
"id": "1",
"translation": {
"es": "7x^2 + 11x^3 - 4x^5 + 3x^4 - x^8",
"pt": "Para factorizar la expresión (7x^2 + 11x^3 - 4x^5 + 3x^4 - x^8), primero ordenemos los términos en orden descendente según las potencias de (x): -x^8 - 4x^5 + 3x^4 + 11x^3 + 7x^2. Ahora, identifiquemos el factor común. En este caso, el factor común es (x^2). Factorizamos (x^2) de cada término: x^2(-x^6 - 4x^3 + 3x^2 + 11x + 7). Entonces, la expresión factorizada es (x^2(-x^6 - 4x^3 + 3x^2 + 11x + 7))."
}
},
{
"id": "2",
"translation": {
"es": "9x^3 - 6x^2 + 12x^5 - 18x^7",
"pt": "Para factorizar la expresión (9x^3 - 6x^2 + 12x^5 - 18x^7), primero ordenemos los términos en orden descendente según las potencias de (x): -18x^7 + 12x^5 + 9x^3 - 6x^2. Ahora, identifiquemos el factor común. En este caso, el factor común es (3x^2). Factorizamos (3x^2) de cada término: 3x^2(-6x^5 + 4x^3 + 3x - 2). Entonces, la expresión factorizada es (3x^2(-6x^5 + 4x^3 + 3x - 2))."
}
},
{
"id": "3",
"translation": {
"es": "5x+5y",
"pt": "La expresión (5x + 5y) ya está factorizada en términos de primer grado. Sin embargo, si deseas factorizar el término común, puedes hacerlo de la siguiente manera: 5x + 5y. Factor común: (5). 5(x + y). Entonces, la expresión factorizada es (5(x + y))."
}
},
{
"id": "4",
"translation": {
"es": "-6a-9b-3c",
"pt": "El proceso de factorización de la expresión (-6a - 9b - 3c): Agrupamos los términos: (-6a - 9b) - 3c. En el primer grupo, factorizamos el común factor (-3) de (-6a - 9b): -3(2a + 3b) - 3c. Ahora, podemos factorizar el común factor (-3) del primer grupo: -3(2a + 3b + c). Por lo tanto, la expresión (-6a - 9b - 3c) factoriza como (-3(2a + 3b + c))."
}
},
{
"id": "5",
"translation": {
"es": "x^2 + 2x",
"pt": "Para factorizar la expresión (x^2 + 2x). Vamos a realizar el proceso de factorización paso a paso. Factor común: Primero, observamos si hay algún factor común en ambos términos. En este caso, (x) es un factor común. x(x + 2). Verificación: Podemos verificar si hemos factorizado correctamente multiplicando los factores para asegurarnos de que obtenemos la expresión original. x(x + 2) = x^2 + 2x. La expresión original es igual a la factorización, por lo que hemos factorizado correctamente. Entonces, la factorización de (x^2 + 2x) es (x(x + 2))."
}
},
{
"id": "6",
"translation": {
"es": "4x^2 - 12x",
"pt": "Para factorizar la expresión (4x^2 - 12x), primero identificamos el factor común. En este caso, el factor común es (4x). Ahora, factorizamos (4x) de cada término: 4x(x - 3). Entonces, la expresión factorizada es (4x(x - 3))."
}
},
{
"id": "7",
"translation": {
"es": "2x^2 + 6xy + 4y^2",
"pt": "Para factorizar la expresión (2x^2 + 6xy + 4y^2), primero identificamos el factor común. En este caso, el factor común es (2). Ahora, factorizamos (2) de cada término: 2(x^2 + 3xy + 2y^2). Luego, observamos que (x^2 + 3xy + 2y^2) es una expresión cuadrática perfecta llamada cuadrado de un binomio. Por lo tanto, factorizamos esta expresión como sigue: (x + y)^2. Finalmente, la expresión factorizada es (2(x + y)^2)."
}
},
{
"id": "8",
"translation": {
"es": "x^2 - 9",
"pt": "Para factorizar la expresión (x^2 - 9), primero observamos que es una diferencia de cuadrados. Por lo tanto, podemos factorizarla como sigue: x^2 - 9 = (x + 3)(x - 3). Aquí, (x + 3) y (x - 3) son los dos factores binomiales que, al multiplicarlos, nos dan la expresión original."
}
},
{
"id": "9",
"translation": {
"es": "4x^2 - 25",
"pt": "Para factorizar la expresión (4x^2 - 25), primero observamos que es una diferencia de cuadrados. Por lo tanto, podemos factorizarla como sigue: 4x^2 - 25 = (2x + 5)(2x - 5). Aquí, (2x + 5) y (2x - 5) son los dos factores binomiales que, al multiplicarlos, nos dan la expresión original."
}
},
{
"id": "10",
"translation": {
"es": "x^3 - 8",
"pt": "Para factorizar la expresión (x^3 - 8), primero observamos que es un cubo perfecto. Por lo tanto, podemos factorizarla como sigue: x^3 - 8 = (x - 2)(x^2 + 2x + 4). Aquí, (x - 2) es el factor binomial y (x^2 + 2x + 4) es el factor trinomial que, al multiplicarlos, nos dan la expresión original."
}
},
{
"id": "11",
"translation": {
"es": "27x^3 + 8",
"pt": "Para factorizar la expresión (27x^3 + 8), primero observamos que es una suma de cubos. Por lo tanto, podemos factorizarla como sigue: 27x^3 + 8 = (3x + 2)(9x^2 - 6x + 4). Aquí, (3x + 2) es el factor binomial y (9x^2 - 6x + 4) es el factor trinomial que, al multiplicarlos, nos dan la expresión original."
}
},
{
"id": "12",
"translation": {
"es": "8x^2 - 6x",
"pt": "Para factorizar la expresión (8x^2 - 6x), primero observamos que el coeficiente del primer término es 8 y el coeficiente del segundo término es -6. Buscando dos números que multiplicados den 8 y sumados den -6, encontramos que los números son 8 y -1. Por lo tanto, podemos factorizar la expresión de la siguiente manera: 8x^2 - 6x = 8x^2 - 8x + 2x - 6. Ahora, podemos agrupar los términos: (8x^2 - 8x) + (2x - 6). Factoremos cada grupo: 8x(x - 1) + 2(x - 3). (x - 1) es un factor común en ambos términos, por lo tanto, factorizamos (x - 1) de ambos términos: (x - 1)(8x + 2). Finalmente, la expresión factorizada es (x - 1)(8x + 2)."
}
},
{
"id": "13",
"translation": {
"es": "12x^2 - 15x - 18",
"pt": "Para factorizar la expresión (12x^2 - 15x - 18), primero ordenamos los términos en orden descendente según las potencias de (x): 12x^2 - 15x - 18. Ahora, buscamos dos números que multiplicados den -18 y sumados den -15. Los números que cumplen estas condiciones son 9 y -2. Por lo tanto, podemos factorizar la expresión de la siguiente manera: 12x^2 - 15x - 18 = 12x^2 + 9x - 24x - 18. Ahora, podemos agrupar los términos: (12x^2 + 9x) - (24x + 18). Factoremos cada grupo: 3x(4x + 3) - 6(4x + 3). (4x + 3) es un factor común en ambos términos, por lo tanto, factorizamos (4x + 3) de ambos términos: (4x + 3)(3x - 6). Finalmente, la expresión factorizada es (4x + 3)(3x - 6)."
}
},
{
"id": "14",
"translation": {
"es": "x^3 - 2x^2 - 4x + 8",
"pt": "Para factorizar la expresión (x^3 - 2x^2 - 4x + 8), primero agrupamos los términos de manera adecuada. La expresión se puede reorganizar en dos grupos: (x^3 - 2x^2) - (4x - 8). Ahora, en cada grupo, factorizamos los términos comunes: Grupo 1: Factor común de (x^2) en (x^3 - 2x^2): x^2(x - 2). Grupo 2: Factor común de (4) en (4x - 8): 4(x - 2). Finalmente, podemos escribir la expresión factorizada como la suma de los dos grupos factorizados: x^2(x - 2) - 4(x - 2)."
}
},
{
"id": "15",
"translation": {
"es": "4x + 8y - 12z",
"pt": "Para factorizar la expresión (4x + 8y - 12z), primero debemos encontrar el factor común de los términos. En este caso, el factor común es (4). Luego, factorizamos el término común de cada uno de los términos: 4(x + 2y - 3z). Por lo tanto, la expresión factorizada es (4(x + 2y - 3z))."
}
},
{
"id": "16",
"translation": {
"es": "10a - 15b + 20c",
"pt": "Para factorizar la expresión (10a - 15b + 20c), primero debemos encontrar el factor común de los términos. En este caso, el factor común es (5). Luego, factorizamos el término común de cada uno de los términos: 5(2a - 3b + 4c). Por lo tanto, la expresión factorizada es (5(2a - 3b + 4c))."
}
},
{
"id": "17",
"translation": {
"es": "12x^2 + 18x^3 - 24x^4",
"pt": "Para factorizar la expresión (12x^2 + 18x^3 - 24x^4), primero debemos encontrar el factor común de los términos. En este caso, el factor común es (6x^2). Luego, factorizamos el término común de cada uno de los términos: 6x^2(2 + 3x - 4x^2). Por lo tanto, la expresión factorizada es (6x^2(2 + 3x - 4x^2))."
}
},
{
"id": "18",
"translation": {
"es": "8a^3 - 12a^2 + 16a",
"pt": "Para factorizar la expresión (8a^3 - 12a^2 + 16a), primero debemos encontrar el factor común de los términos. En este caso, el factor común es (4a). Luego, factorizamos el término común de cada uno de los términos: 4a(2a^2 - 3a + 4). Por lo tanto, la expresión factorizada es (4a(2a^2 - 3a + 4))."
}
},
{
"id": "19",
"translation": {
"es": "10x^2 - 15x",
"pt": "Para factorizar la expresión (10x^2 - 15x), primero identifiquemos el factor común: 5x. Factorizamos 5x de cada término: 5x(2x - 3). Entonces, la expresión factorizada es (5x(2x - 3))."
}
},
{
"id": "20",
"translation": {
"es": "8y^3 + 12y^2 - 4y",
"pt": "Para factorizar la expresión (8y^3 + 12y^2 - 4y), primero identifiquemos el factor común: 4y. Factorizamos 4y de cada término: 4y(2y^2 + 3y - 1). Entonces, la expresión factorizada es (4y(2y^2 + 3y - 1))."
}
},
{
"id": "21",
"translation": {
"es": "14a^3 - 21a^2 + 7a",
"pt": "Para factorizar la expresión (14a^3 - 21a^2 + 7a), primero identifiquemos el factor común: 7a. Factorizamos 7a de cada término: 7a(2a^2 - 3a + 1). Entonces, la expresión factorizada es (7a(2a^2 - 3a + 1))."
}
},
{
"id": "22",
"translation": {
"es": "9x^2 + 12xy + 4y^2",
"pt": "Para factorizar la expresión (9x^2 + 12xy + 4y^2), primero ordenemos los términos de manera adecuada. La expresión se puede reorganizar en tres grupos: (9x^2 + 12xy) + (4y^2). Ahora, en cada grupo, factorizamos los términos comunes: Grupo 1: Factor común de (3x) en (9x^2 + 12xy): 3x(3x + 4y). Grupo 2: Factor común de (4) en (4y^2): 4y^2. Finalmente, podemos escribir la expresión factorizada como la suma de los dos grupos factorizados: 3x(3x + 4y) + 4y^2."
}
},
{
"id": "23",
"translation": {
"es": "3(x^2 + 2x + 1) - 4(2x^2 - 3x + 5)",
"pt": "Para factorizar esta expresión, empezaremos extrayendo el factor común más grande de cada término. En este caso, el factor común es (x). 3(x^2 + 2x + 1) - 4(2x^2 - 3x + 5). Factorizando el factor común, obtenemos: 3(x(x + 2 + 1)) - 4(2x(x - 3/2 + 5/2)). Expandiendo los términos, tenemos: 3(x(x + 3)) - 4(2x(x + 11/2)). Ahora, podemos simplificar la expresión combinando los términos semejantes: 3x(x + 3) - 8x(x + 11/2). Finalmente, la expresión factorizada es: 3x(x + 3) - 8x(x + 11/2)."
}
},
{
"id": "24",
"translation": {
"es": "7x^2y^3 + 14x^3y^4 - 21xy^5",
"pt": "Para factorizar esta expresión, primero buscamos el factor común más grande de todos los términos. En este caso, el factor común es (xy^2). 7x^2y^3 + 14x^3y^4 - 21xy^5. Factorizando el factor común, obtenemos: xy^2(7xy + 14x^2y^2 - 21y^3). Ahora, podemos factorizar aún más el término dentro del paréntesis extrayendo el factor común (7y). xy^2(7xy + 14x^2y^2 - 21y^3). Factoreando el factor común, tenemos: xy^2(7y(x + 2xy^2 - 3y^2)). Por último, podemos simplificar la expresión combinando los términos semejantes: xy^2(7y(x - 3y^2 + 2xy^2)). Finalmente, la expresión factorizada es: xy^2(7y(x - 3y^2 + 2xy^2))."
}
},
{
"id": "25",
"translation": {
"es": "-2x^2 + 8x + 6",
"pt": "Para factorizar esta expresión, empezaremos extrayendo el factor común más grande de cada término. En este caso, el factor común es (2). -2x^2 + 8x + 6. Factorizando el factor común, obtenemos: 2(-x^2 + 4x + 3). Ahora, podemos factorizar aún más el término dentro del paréntesis extrayendo el factor común (x - 1). 2(-(x^2 - 4x - 3)). Por último, podemos simplificar la expresión combinando los términos semejantes: 2(-(x - 1)(x - 3)). Finalmente, la expresión factorizada es: 2(x - 1)(x - 3)."
}
},
{
"id": "26",
"translation": {
"es": "3x^3 - 9x^2 - 12x",
"pt": "Para factorizar esta expresión, empezaremos extrayendo el factor común más grande de cada término. En este caso, el factor común es (3x). 3x^3 - 9x^2 - 12x. Factorizando el factor común, obtenemos: 3x(x^2 - 3x - 4). Ahora, podemos factorizar aún más el término dentro del paréntesis extrayendo el factor común (x + 1). 3x((x^2 - 3x - 4)). Por último, podemos simplificar la expresión combinando los términos semejantes: 3x(x + 1)(x - 4). Finalmente, la expresión factorizada es: 3x(x + 1)(x - 4)."
}
},
{
"id": "27",
"translation": {
"es": "2x^2 + 3xy - 5y^2",
"pt": "Para factorizar esta expresión, empezaremos extrayendo el factor común más grande de cada término. En este caso, no hay un factor común obvio. Por lo tanto, podemos intentar factorizar por agrupación. 2x^2 + 3xy - 5y^2. Agrupando los dos primeros términos y los dos últimos términos, obtenemos: (2x^2 + 3xy) - (5y^2). Ahora, podemos factorizar cada grupo. 2x^2 + 3xy = x(2x + 3y). 5y^2 = 5y(y). Finalmente, la expresión factorizada es: x(2x + 3y) - 5y(y)."
}
},
{
"id": "28",
"translation": {
"es": "3x^2 - 9x + 6",
"pt": "Para factorizar la expresión (3x^2 - 9x + 6), primero verificamos si se puede extraer un factor común. En este caso, el factor común es (3): 3(x^2 - 3x + 2). Luego, factorizamos el polinomio cuadrático (x^2 - 3x + 2): (x - 1)(x - 2). Por lo tanto, la expresión factorizada es 3(x - 1)(x - 2)."
}
},
{
"id": "29",
"translation": {
"es": "2a^2 - 8a + 6",
"pt": "Para factorizar la expresión (2a^2 - 8a + 6), primero verificamos si se puede extraer un factor común. En este caso, el factor común es (2): 2(a^2 - 4a + 3). Luego, factorizamos el polinomio cuadrático (a^2 - 4a + 3): (a - 1)(a - 3). Por lo tanto, la expresión factorizada es 2(a - 1)(a - 3)."
}
},
{
"id": "30",
"translation": {
"es": "4x^2 + 12x + 9",
"pt": "Para factorizar la expresión (4x^2 + 12x + 9), primero verificamos si se puede extraer un factor común. En este caso, el factor común es (1): 1(4x^2 + 12x + 9). Luego, factorizamos el polinomio cuadrático (4x^2 + 12x + 9): (2x + 3)(2x + 3). Por lo tanto, la expresión factorizada es (2x + 3)^2."
}
},
{
"id": "31",
"translation": {
"es": "x^2 - 5x - 14",
"pt": "Para factorizar la expresión (x^2 - 5x - 14), primero verificamos si se puede extraer un factor común. En este caso, no hay un factor común. Por lo tanto, factorizamos el polinomio cuadrático (x^2 - 5x - 14): (x - 7)(x + 2). Por lo tanto, la expresión factorizada es (x - 7)(x + 2)."
}
},
{
"id": "32",
"translation": {
"es": "2x^2 + 5x + 3",
"pt": "Para factorizar la expresión (2x^2 + 5x + 3), primero verificamos si se puede extraer un factor común. En este caso, no hay un factor común. Por lo tanto, factorizamos el polinomio cuadrático (2x^2 + 5x + 3): (2x + 3)(x + 1). Por lo tanto, la expresión factorizada es (2x + 3)(x + 1)."
}
},
{
"id": "33",
"translation": {
"es": "3x^2 - 4x - 7",
"pt": "Para factorizar la expresión (3x^2 - 4x - 7), primero verificamos si se puede extraer un factor común. En este caso, no hay un factor común. Por lo tanto, factorizamos el polinomio cuadrático (3x^2 - 4x - 7): (3x + 1)(x - 7). Por lo tanto, la expresión factorizada es (3x + 1)(x - 7)."
}
},
{
"id": "34",
"translation": {
"es": "4x^2 + 9x + 5",
"pt": "Para factorizar la expresión (4x^2 + 9x + 5), primero verificamos si se puede extraer un factor común. En este caso, no hay un factor común. Por lo tanto, factorizamos el polinomio cuadrático (4x^2 + 9x + 5): (4x + 5)(x + 1). Por lo tanto, la expresión factorizada es (4x + 5)(x + 1)."
}
},
{
"id": "35",
"translation": {
"es": "3x^2 - 12x + 12",
"pt": "Para factorizar la expresión (3x^2 - 12x + 12), primero sacamos el factor común (3) de todos los términos: 3(x^2 - 4x + 4). Ahora, reconocemos que la expresión dentro del paréntesis es un cuadrado perfecto: (x - 2)^2. Entonces, la expresión factorizada es 3(x - 2)^2."
}
},
{
"id": "36",
"translation": {
"es": "2x(x - 1) + 3(x - 1)",
"pt": "Para factorizar la expresión (2x(x - 1) + 3(x - 1)), primero identificamos el factor común en ambos términos: (x - 1). Factorizamos (x - 1) de la expresión: (x - 1)(2x + 3). Por lo tanto, la expresión factorizada es ((x - 1)(2x + 3))."
}
},
{
"id": "37",
"translation": {
"es": "p^2 + 4pq + 4q^2",
"pt": "Para factorizar la expresión (p^2 + 4pq + 4q^2), primero identifiquemos el factor común. En este caso, el factor común es (p + 2q). Factorizamos (p + 2q) de la expresión: (p + 2q)(p + 2q). Por lo tanto, la expresión factorizada es ((p + 2q)(p + 2q))."
}
},
{
"id": "38",
"translation": {
"es": "p^2 + 4pq + 4q^2",
"pt": "Para factorizar la expresión (p^2 + 4pq + 4q^2), primero identifiquemos el factor común. En este caso, el factor común es (p + 2q). Factorizamos (p + 2q) de la expresión: (p + 2q)(p + 2q). Por lo tanto, la expresión factorizada es ((p + 2q)(p + 2q))."
}
},
{
"id": "39",
"translation": {
"es": "10x^2 + 20x + 10",
"pt": "Para factorizar la expresión (10x^2 + 20x + 10), primero identifiquemos el factor común. En este caso, el factor común es (10). Factorizamos (10) de cada término: 10(x^2 + 2x + 1). Ahora, factorizamos el trinomio cuadrado(x^2 + 2x + 1) utilizando la fórmula de la suma de dos cuadrados: (x + 1)^2. Entonces, la expresión factorizada es (10(x + 1)^2)."
}
},
{
"id": "40",
"translation": {
"es": "9x^2 - 25",
"pt": "Para factorizar la expresión (9x^2 - 25), primero identifiquemos el factor común. En este caso, el factor común es (1). Entonces, la expresión ya está factorizada en términos de primer grado."
}
},
{
"id": "41",
"translation": {
"es": "4x^2 - 9y^2",
"pt": "Para factorizar la expresión (4x^2 - 9y^2), primero identifiquemos el factor común. En este caso, el factor común es (1). Entonces, la expresión ya está factorizada en términos de primer grado."
}
},
{
"id": "42",
"translation": {
"es": "12x^3 - 18x^2 + 6x",
"pt": "Para factorizar la expresión (12x^3 - 18x^2 + 6x), primero identifiquemos el factor común. En este caso, el factor común es (6x). Factorizamos (6x) de cada término: 6x(2x^2 - 3x + 1). Ahora, factorizamos el trinomio cuadrado(2x^2 - 3x + 1) utilizando la fórmula de la factorización de x^2 + bx + c: (2x - 1)(x - 1). Entonces, la expresión factorizada es (6x(2x - 1)(x - 1))."
}
},
{
"id": "43",
"translation": {
"es": "10x^2-20x+30",
"pt": "Para factorizar la expresión (10x^2 - 20x + 30), primero identificamos el factor común: 10. Factorizamos 10 de cada término: 10(x^2 - 2x + 3). Ahora, necesitamos factorizar el trinomio cuadrático (x^2 - 2x + 3). Podemos usar el método del cuadrado perfecto para factorizarlo: (x - 1)^2. Por lo tanto, la expresión factorizada es 10(x - 1)^2."
}
},
{
"id": "44",
"translation": {
"es": "12x^3-9x^2+6x",
"pt": "Para factorizar la expresión (12x^3 - 9x^2 + 6x), primero identificamos el factor común: 3x. Factorizamos 3x de cada término: 3x(4x^2 - 3x + 2). Ahora, factorizamos el trinomio cuadrático (4x^2 - 3x + 2) usando el método de la factorización: (2x - 1)(2x - 2). Por lo tanto, la expresión factorizada es 3x(2x - 1)(2x - 2)."
}
},
{
"id": "45",
"translation": {
"es": "15x^4-20x^3+10x^2",
"pt": "Para factorizar la expresión (15x^4 - 20x^3 + 10x^2), primero identificamos el factor común: 5x^2. Factorizamos 5x^2 de cada término: 5x^2(3x^2 - 4x + 2). Ahora, factorizamos el trinomio cuadrático (3x^2 - 4x + 2) usando el método de la factorización: (3x - 2)(x - 1). Por lo tanto, la expresión factorizada es 5x^2(3x - 2)(x - 1)."
}
},
{
"id": "46",
"translation": {
"es": "2x^3-8x^2+6x",
"pt": "Para factorizar la expresión (2x^3 - 8x^2 + 6x), primero identificamos el factor común: 2x. Factorizamos 2x de cada término: 2x(x^2 - 4x + 3). Ahora, factorizamos el trinomio cuadrático (x^2 - 4x + 3) usando el método de la factorización: (x - 1)(x - 3). Por lo tanto, la expresión factorizada es 2x(x - 1)(x - 3)."
}
},
{
"id": "47",
"translation": {
"es": "12x^2 - 16x + 20x^3 - 28x^4",
"pt": "Para factorizar la expresión (12x^2 - 16x + 20x^3 - 28x^4), primero ordenemos los términos en orden descendente según las potencias de (x): -28x^4 + 20x^3 + 12x^2 - 16x. Ahora, identifiquemos el factor común. En este caso, el factor común es (4x). Factorizamos (4x) de cada término: 4x(-7x^3 + 5x^2 + 3x - 4). Entonces, la expresión factorizada es (4x(-7x^3 + 5x^2 + 3x - 4))."
}
},
{
"id": "48",
"translation": {
"es": "14x^2y^3 - 21xy^2 + 7xy - 14xy^4",
"pt": "Para factorizar la expresión (14x^2y^3 - 21xy^2 + 7xy - 14xy^4), primero ordenemos los términos en orden descendente según las potencias de (x) y (y): -14xy^4 + 14x^2y^3 - 21xy^2 + 7xy. Ahora, identifiquemos el factor común. En este caso, el factor común es (7xy). Factorizamos (7xy) de cada término: 7xy(-2y^3 + 2x^2 - 3y + 1). Entonces, la expresión factorizada es (7xy(-2y^3 + 2x^2 - 3y + 1))."
}
},
{
"id": "49",
"translation": {
"es": "8x^3 - 4x^2 + 12x - 6",
"pt": "Para factorizar la expresión (8x^3 - 4x^2 + 12x - 6), primero ordenemos los términos en orden descendente según las potencias de (x): 8x^3 - 4x^2 + 12x - 6. Ahora, identifiquemos el factor común. En este caso, el factor común es (2). Factorizamos (2) de cada término: 2(4x^3 - 2x^2 + 6x - 3). Entonces, la expresión factorizada es (2(4x^3 - 2x^2 + 6x - 3))."
}
},
{
"id": "50",
"translation": {
"es": "10x^2y^3 - 20xy + 30xy^2 - 15xy^4",
"pt": "Para factorizar la expresión (10x^2y^3 - 20xy + 30xy^2 - 15xy^4), primero ordenemos los términos en orden descendente según las potencias de (x) y (y): -15xy^4 + 10x^2y^3 + 30xy^2 - 20xy. Ahora, identifiquemos el factor común. En este caso, el factor común es (5xy). Factorizamos (5xy) de cada término: 5xy(-3y^3 + 2x^2 + 6y - 4). Entonces, la expresión factorizada es (5xy(-3y^3 + 2x^2 + 6y - 4))."
}
}
]
| spongebob01/formulas | [
"region:us"
] | 2024-01-15T06:37:14+00:00 | {} | 2024-01-15T08:00:49+00:00 |
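Worked factorizations like the ones in the record above are easy to machine-check by expanding each proposed answer back and comparing it against the original expression. A minimal sketch, assuming the `sympy` package (the record checked here is just one illustrative example):

```python
from sympy import symbols, expand, factor

x, y = symbols("x y")

# Record id 7: 2x^2 + 6xy + 4y^2 should factor as 2(x + y)(x + 2y).
original = 2*x**2 + 6*x*y + 4*y**2
proposed = 2*(x + y)*(x + 2*y)

# Expanding the proposed factorization must reproduce the original.
assert expand(proposed) == expand(original)

print(factor(original))  # -> 2*(x + y)*(x + 2*y)
```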
11320541b712538d54744c764b71a3985df14204 | Cgk1000/funniRVCdatasetformodels | [
"license:openrail",
"region:us"
] | 2024-01-15T06:44:37+00:00 | {"license": "openrail"} | 2024-01-15T06:45:07+00:00 |
|
b9ab3628a2b4e26b269638669ea553036887d4fd | maulinnasari/dataset_ext_75_mn | [
"region:us"
] | 2024-01-15T06:44:42+00:00 | {"dataset_info": {"features": [{"name": "document", "sequence": "string"}, {"name": "summary", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 422166589, "num_examples": 44972}, {"name": "validation", "num_bytes": 51569079, "num_examples": 5622}, {"name": "test", "num_bytes": 52113083, "num_examples": 5622}], "download_size": 309012659, "dataset_size": 525848751}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-15T06:44:56+00:00 |
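The `dataset_info` metadata above declares parquet-backed `train`, `validation`, and `test` splits, so the dataset can be loaded directly by name. A minimal sketch, assuming the `datasets` library:

```python
from datasets import load_dataset

# Resolves the parquet shards listed in the configs metadata.
ds = load_dataset("maulinnasari/dataset_ext_75_mn")

# Expected: a DatasetDict with train / validation / test splits,
# each row holding a `document` sequence and a `summary` string.
print(ds)
print(ds["train"][0]["summary"])
```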
|
9f62e4473ef9bd94f2763d20076f1e4e89c2ed07 | maulinnasari/dataset_ext_50_mn | [
"region:us"
] | 2024-01-15T06:47:25+00:00 | {"dataset_info": {"features": [{"name": "document", "sequence": "string"}, {"name": "summary", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 297808585, "num_examples": 44972}, {"name": "validation", "num_bytes": 36387952, "num_examples": 5622}, {"name": "test", "num_bytes": 36752761, "num_examples": 5622}], "download_size": 222818544, "dataset_size": 370949298}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-15T06:47:36+00:00 |
|
3f8d2414fb5e60ddba1ea4711b6684d0a536cf04 | maulinnasari/dataset_ext_25_mn | [
"region:us"
] | 2024-01-15T06:49:18+00:00 | {"dataset_info": {"features": [{"name": "document", "sequence": "string"}, {"name": "summary", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 182738113, "num_examples": 44972}, {"name": "validation", "num_bytes": 22402143, "num_examples": 5622}, {"name": "test", "num_bytes": 22597567, "num_examples": 5622}], "download_size": 141206629, "dataset_size": 227737823}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-15T06:49:24+00:00 |
|
fd9e7255f3faabea05c229dc8e3be52cd3ee0232 | maulinnasari/dataset_ext_15_mn | [
"region:us"
] | 2024-01-15T06:50:25+00:00 | {"dataset_info": {"features": [{"name": "document", "sequence": "string"}, {"name": "summary", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 138378746, "num_examples": 44972}, {"name": "validation", "num_bytes": 16994675, "num_examples": 5622}, {"name": "test", "num_bytes": 17112258, "num_examples": 5622}], "download_size": 109003001, "dataset_size": 172485679}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-15T06:50:31+00:00 |
|
142320106c8c57e12dd1443713c9091300a0e5c3 | Gincy/finet.csv | [
"region:us"
] | 2024-01-15T06:50:52+00:00 | {} | 2024-01-15T06:50:54+00:00 |
|
0675c0c424f1dcd175c89226730998d572c13fe0 | blackriderrx/mini-platypus-2 | [
"region:us"
] | 2024-01-15T06:55:44+00:00 | {} | 2024-01-15T06:55:44+00:00 |
|
f4d51ac8744bd766ba4b56a9dd6e39f03500a088 | manishiitg/chat-instruct-hi-v2 | [
"region:us"
] | 2024-01-15T07:01:56+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "lang", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1393655848.7799177, "num_examples": 262916}], "download_size": 665296003, "dataset_size": 1393655848.7799177}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-15T07:02:49+00:00 |
|
949f623121d6d516a613456e7b9a9596f5493554 | danielheart/stable-diffusion | [
"license:unknown",
"region:us"
] | 2024-01-15T07:04:54+00:00 | {"license": "unknown"} | 2024-01-15T07:04:54+00:00 |