sha (string, 40 chars) | text (string, 0 to 13.4M chars) | id (string, 2 to 117 chars) | tags (sequence) | created_at (string, 25 chars) | metadata (string, 2 to 31.7M chars) | last_modified (string, 25 chars)
---|---|---|---|---|---|---
0acb7d1d0c8448c4322030de54affc03b38accc2 | ### The ReAct Llama Dataset
### Dataset Summary
This dataset contains 3,538 correct ReAct trajectories generated using llama2-70b (Q5_K_M quant).
It follows the format used in the [ReAct paper](https://arxiv.org/pdf/2210.03629.pdf).\
ReAct trajectories were generated using a modified version of the hotpotqa.ipynb file from the [ReAct repo](https://github.com/ysymyth/ReAct).
The model was prompted in the following format (5-shot) to generate these traces:
```
Solve a question answering task with interleaving Thought, Action, Observation steps. Thought can reason about the current situation, and Action can be three types:
(1) Search[entity], which searches the exact entity on Wikipedia and returns the first paragraph if it exists. If not, it will return some similar entities to search.
(2) Lookup[keyword], which returns the next sentence containing keyword in the current passage.
(3) Finish[answer], which returns the answer and finishes the task.
Here are some examples.
Question: What is the elevation range for the area that the eastern sector of the Colorado orogeny extends into?
Thought 1: I need to search Colorado orogeny, find the area that the eastern sector of the Colorado orogeny extends into, then find the elevation range of the area.
Action 1: Search[Colorado orogeny]
Observation 1: The Colorado orogeny was an episode of mountain building (an orogeny) in Colorado and surrounding areas.
Thought 2: It does not mention the eastern sector. So I need to look up eastern sector.
Action 2: Lookup[eastern sector]
Observation 2: (Result 1 / 1) The eastern sector extends into the High Plains and is called the Central Plains orogeny.
Thought 3: The eastern sector of Colorado orogeny extends into the High Plains. So I need to search High Plains and find its elevation range.
Action 3: Search[High Plains]
Observation 3: High Plains refers to one of two distinct land regions:
Thought 4: I need to instead search High Plains (United States).
Action 4: Search[High Plains (United States)]
Observation 4: The High Plains are a subregion of the Great Plains. From east to west, the High Plains rise in elevation from around 1,800 to 7,000 ft (550 to 2,130 m).[3]
Thought 5: High Plains rise in elevation from around 1,800 to 7,000 ft, so the answer is 1,800 to 7,000 ft.
Action 5: Finish[1,800 to 7,000 ft]
Question: Musician and satirist Allie Goertz wrote a song about the "The Simpsons" character Milhouse, who Matt Groening named after who?
Thought 1: The question simplifies to "The Simpsons" character Milhouse is named after who. I only need to search Milhouse and find who it is named after.
Action 1: Search[Milhouse]
Observation 1: Milhouse Mussolini Van Houten is a recurring character in the Fox animated television series The Simpsons voiced by Pamela Hayden and created by Matt Groening.
Thought 2: The paragraph does not tell who Milhouse is named after, maybe I can look up "named after".
Action 2: Lookup[named after]
Observation 2: (Result 1 / 1) Milhouse was named after U.S. president Richard Nixon, whose middle name was Milhous.
Thought 3: Milhouse was named after U.S. president Richard Nixon, so the answer is Richard Nixon.
Action 3: Finish[Richard Nixon]
Question: Which documentary is about Finnish rock groups, Adam Clayton Powell or The Saimaa Gesture?
Thought 1: I need to search Adam Clayton Powell and The Saimaa Gesture, and find which documentary is about Finnish rock groups.
Action 1: Search[Adam Clayton Powell]
Observation 1: Could not find [Adam Clayton Powell]. Similar: ['Adam Clayton Powell III', 'Seventh Avenue (Manhattan)', 'Adam Clayton Powell Jr. State Office Building', 'Isabel Washington Powell', 'Adam Powell', 'Adam Clayton Powell (film)', 'Giancarlo Esposito'].
Thought 2: To find the documentary, I can search Adam Clayton Powell (film).
Action 2: Search[Adam Clayton Powell (film)]
Observation 2: Adam Clayton Powell is a 1989 American documentary film directed by Richard Kilberg.
The film is about the rise and fall of influential African-American politician Adam Clayton Powell Jr.[3][4] It was later aired as part of the PBS series The American Experience.
Thought 3: Adam Clayton Powell (film) is a documentary about an African-American politician, not Finnish rock groups. So the documentary about Finnish rock groups must instead be The Saimaa Gesture.
Action 3: Finish[The Saimaa Gesture]
Question: What profession does Nicholas Ray and Elia Kazan have in common?
Thought 1: I need to search Nicholas Ray and Elia Kazan, find their professions, then find the profession they have in common.
Action 1: Search[Nicholas Ray]
Observation 1: Nicholas Ray (born Raymond Nicholas Kienzle Jr., August 7, 1911 - June 16, 1979) was an American film director, screenwriter, and actor best known for the 1955 film Rebel Without a Cause.
Thought 2: Professions of Nicholas Ray are director, screenwriter, and actor. I need to search Elia Kazan next and find his professions.
Action 2: Search[Elia Kazan]
Observation 2: Elia Kazan was an American film and theatre director, producer, screenwriter and actor.
Thought 3: Professions of Elia Kazan are director, producer, screenwriter, and actor. So profession Nicholas Ray and Elia Kazan have in common is director, screenwriter, and actor.
Action 3: Finish[director, screenwriter, actor]
Question: Which magazine was started first Arthur's Magazine or First for Women?
Thought 1: I need to search Arthur's Magazine and First for Women, and find which was started first.
Action 1: Search[Arthur's Magazine]
Observation 1: Arthur's Magazine (1844-1846) was an American literary periodical published in Philadelphia in the 19th century.
Thought 2: Arthur's Magazine was started in 1844. I need to search First for Women next.
Action 2: Search[First for Women]
Observation 2: First for Women is a woman's magazine published by Bauer Media Group in the USA.[1] The magazine was started in 1989.
Thought 3: First for Women was started in 1989. 1844 (Arthur's Magazine) < 1989 (First for Women), so Arthur's Magazine was started first.
Action 3: Finish[Arthur's Magazine]
Question: Were Pavel Urysohn and Leonid Levin known for the same type of work?
Thought 1: I need to search Pavel Urysohn and Leonid Levin, find their types of work, then find if they are the same.
Action 1: Search[Pavel Urysohn]
Observation 1: Pavel Samuilovich Urysohn (February 3, 1898 - August 17, 1924) was a Soviet mathematician who is best known for his contributions in dimension theory.
Thought 2: Pavel Urysohn is a mathematician. I need to search Leonid Levin next and find its type of work.
Action 2: Search[Leonid Levin]
Observation 2: Leonid Anatolievich Levin is a Soviet-American mathematician and computer scientist.
Thought 3: Leonid Levin is a mathematician and computer scientist. So Pavel Urysohn and Leonid Levin have the same type of work.
Action 3: Finish[yes]
Question: <insert-question-here>
```
The Wikipedia API tool that the language model has access to here is unmodified from the code given in the ReAct repository. | xz56/react-llama | [
"size_categories:1K<n<10K",
"language:en",
"license:apache-2.0",
"arxiv:2210.03629",
"region:us"
] | 2024-02-16T07:56:23+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "question", "dtype": "string"}, {"name": "correct_answer", "dtype": "string"}, {"name": "trajectory", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7686150, "num_examples": 3538}], "download_size": 4306541, "dataset_size": 7686150}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T08:09:45+00:00 |
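As a usage note for the `xz56/react-llama` card above: a minimal sketch of loading the trajectories and inspecting one record, assuming the single `train` split and the `id`, `question`, `correct_answer`, and `trajectory` features listed in the repository metadata:
```python
from datasets import load_dataset

# Load the ReAct trajectories (one "train" split per the dataset_info above).
dataset = load_dataset("xz56/react-llama", split="train")

# Each record pairs a question with its gold answer and the full
# Thought/Action/Observation trace generated by llama2-70b.
example = dataset[0]
print(example["question"])
print(example["correct_answer"])
print(example["trajectory"][:500])  # first part of the interleaved trace
```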
3f9fee40a912eab1231171e1e445e6cc14b07fb5 | trappy/katland_cat2_XL | [
"region:us"
] | 2024-02-16T08:04:59+00:00 | {} | 2024-02-16T08:09:42+00:00 |
|
caaee27b80ebfab109ff004f71674fee227bfc23 | trappy/katlandcat2XL | [
"region:us"
] | 2024-02-16T08:06:52+00:00 | {} | 2024-02-16T08:06:52+00:00 |
|
1021ed76cd4026593836e84cc08f32746853f27d | EllieS/pubmedqa_dpo_selfgen_data | [
"region:us"
] | 2024-02-16T08:11:22+00:00 | {} | 2024-02-16T08:40:09+00:00 |
|
9fbee48b840dcccfd4ae3ff57ce43dada6520590 | mahdi402/mbot | [
"region:us"
] | 2024-02-16T08:11:45+00:00 | {} | 2024-02-16T08:23:25+00:00 |
|
af9a14e10696e29a45b22294e56b6d54ddc1e998 | chavinlo/hdvila5ktest_group | [
"region:us"
] | 2024-02-16T08:12:19+00:00 | {} | 2024-02-16T08:12:31+00:00 |
|
666c39e41de0434a675597aaf0a2126a0baa2ed3 | AbhishekJosyula/Wikisun_QA_Format | [
"size_categories:100K<n<1M",
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-02-16T08:12:34+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["100K<n<1M"], "pretty_name": "WikisumQA", "dataset_info": {"features": [{"name": "Text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 954740029, "num_examples": 180889}], "download_size": 569372772, "dataset_size": 954740029}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T08:57:49+00:00 |
|
efd0f6ebca27ff79724663f0bfd1d6e463252469 | jlbaker361/cyberpunk-500-cropped | [
"region:us"
] | 2024-02-16T08:19:24+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}, {"name": "frame", "dtype": "int64"}, {"name": "title", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 104772251.0, "num_examples": 490}], "download_size": 104750440, "dataset_size": 104772251.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T08:19:28+00:00 |
|
ad8d5cf34fd2a2ea07613b6904cfa18fa0d245d1 |
[Open-Orca/1million-gpt-4](https://huggingface.co/datasets/Open-Orca/1million-gpt-4) converted to the ShareGPT and Llama chat formats.
| sanjay920/1million-gpt-4-llama | [
"region:us"
] | 2024-02-16T08:20:36+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}, {"name": "weight", "dtype": "null"}]}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3630771342, "num_examples": 994896}], "download_size": 1980759415, "dataset_size": 3630771342}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T08:26:04+00:00 |
dffaf1d6f0c55a489334bbdca2b447ab7bb473d6 | sanjay920/OpenHermes-2.5-llama | [
"region:us"
] | 2024-02-16T08:23:13+00:00 | {"dataset_info": {"features": [{"name": "topic", "dtype": "string"}, {"name": "views", "dtype": "int64"}, {"name": "source", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "skip_prompt_formatting", "dtype": "bool"}, {"name": "idx", "dtype": "string"}, {"name": "model", "dtype": "string"}, {"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}, {"name": "weight", "dtype": "float64"}]}, {"name": "language", "dtype": "string"}, {"name": "hash", "sequence": "int64"}, {"name": "model_name", "dtype": "string"}, {"name": "custom_instruction", "dtype": "bool"}, {"name": "system_prompt", "dtype": "string"}, {"name": "id", "dtype": "string"}, {"name": "category", "dtype": "string"}, {"name": "avatarUrl", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3277566452, "num_examples": 1001551}], "download_size": 1592211603, "dataset_size": 3277566452}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T08:24:57+00:00 |
|
2305c24f0c2504ec9f12cff0bc6d0c560d16922b | alisson40889/milagre | [
"license:openrail",
"region:us"
] | 2024-02-16T08:26:01+00:00 | {"license": "openrail"} | 2024-02-16T08:26:54+00:00 |
|
acb85808ec4bc4b499f049aad2a7693799b0b1a4 |
## Dataset Description
This dataset is just for testing. It contains GitHub issues and pull requests associated with the 🤗 Datasets [repository](https://github.com/huggingface/datasets). It can be used for semantic search or multilabel text classification. The contents of each GitHub issue are in English.
| lorisrossi/github-issues | [
"task_categories:text-classification",
"task_ids:multi-label-classification",
"annotations_creators:no-annotation",
"language_creators:found",
"multilinguality:monolingual",
"size_categories:1K<n<10K",
"language:en",
"region:us"
] | 2024-02-16T08:27:54+00:00 | {"annotations_creators": ["no-annotation"], "language_creators": ["found"], "language": ["en"], "license": [], "multilinguality": ["monolingual"], "size_categories": ["1K<n<10K"], "source_datasets": [], "task_categories": ["text-classification"], "task_ids": ["multi-label-classification"], "pretty_name": "HuggingFace Datasets GitHub Issues", "tags": [], "dataset_info": {"features": [{"name": "url", "dtype": "string"}, {"name": "repository_url", "dtype": "string"}, {"name": "labels_url", "dtype": "string"}, {"name": "comments_url", "dtype": "string"}, {"name": "events_url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "number", "dtype": "int64"}, {"name": "title", "dtype": "string"}, {"name": "user", "struct": [{"name": "login", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "avatar_url", "dtype": "string"}, {"name": "gravatar_id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "followers_url", "dtype": "string"}, {"name": "following_url", "dtype": "string"}, {"name": "gists_url", "dtype": "string"}, {"name": "starred_url", "dtype": "string"}, {"name": "subscriptions_url", "dtype": "string"}, {"name": "organizations_url", "dtype": "string"}, {"name": "repos_url", "dtype": "string"}, {"name": "events_url", "dtype": "string"}, {"name": "received_events_url", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "site_admin", "dtype": "bool"}]}, {"name": "labels", "list": [{"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "name", "dtype": "string"}, {"name": "color", "dtype": "string"}, {"name": "default", "dtype": "bool"}, {"name": "description", "dtype": "string"}]}, {"name": "state", "dtype": "string"}, {"name": "locked", "dtype": "bool"}, {"name": "assignee", "struct": [{"name": "login", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "avatar_url", "dtype": "string"}, {"name": "gravatar_id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "followers_url", "dtype": "string"}, {"name": "following_url", "dtype": "string"}, {"name": "gists_url", "dtype": "string"}, {"name": "starred_url", "dtype": "string"}, {"name": "subscriptions_url", "dtype": "string"}, {"name": "organizations_url", "dtype": "string"}, {"name": "repos_url", "dtype": "string"}, {"name": "events_url", "dtype": "string"}, {"name": "received_events_url", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "site_admin", "dtype": "bool"}]}, {"name": "assignees", "list": [{"name": "login", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "avatar_url", "dtype": "string"}, {"name": "gravatar_id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "followers_url", "dtype": "string"}, {"name": "following_url", "dtype": "string"}, {"name": "gists_url", "dtype": "string"}, {"name": "starred_url", "dtype": "string"}, {"name": "subscriptions_url", "dtype": "string"}, {"name": "organizations_url", "dtype": "string"}, {"name": "repos_url", "dtype": "string"}, {"name": "events_url", "dtype": "string"}, {"name": "received_events_url", "dtype": "string"}, {"name": "type", "dtype": 
"string"}, {"name": "site_admin", "dtype": "bool"}]}, {"name": "milestone", "struct": [{"name": "url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "labels_url", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "number", "dtype": "int64"}, {"name": "title", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "creator", "struct": [{"name": "login", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "avatar_url", "dtype": "string"}, {"name": "gravatar_id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "followers_url", "dtype": "string"}, {"name": "following_url", "dtype": "string"}, {"name": "gists_url", "dtype": "string"}, {"name": "starred_url", "dtype": "string"}, {"name": "subscriptions_url", "dtype": "string"}, {"name": "organizations_url", "dtype": "string"}, {"name": "repos_url", "dtype": "string"}, {"name": "events_url", "dtype": "string"}, {"name": "received_events_url", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "site_admin", "dtype": "bool"}]}, {"name": "open_issues", "dtype": "int64"}, {"name": "closed_issues", "dtype": "int64"}, {"name": "state", "dtype": "string"}, {"name": "created_at", "dtype": "timestamp[s]"}, {"name": "updated_at", "dtype": "timestamp[s]"}, {"name": "due_on", "dtype": "null"}, {"name": "closed_at", "dtype": "null"}]}, {"name": "comments", "sequence": "string"}, {"name": "created_at", "dtype": "timestamp[s]"}, {"name": "updated_at", "dtype": "timestamp[s]"}, {"name": "closed_at", "dtype": "timestamp[s]"}, {"name": "author_association", "dtype": "string"}, {"name": "active_lock_reason", "dtype": "null"}, {"name": "body", "dtype": "string"}, {"name": "reactions", "struct": [{"name": "url", "dtype": "string"}, {"name": "total_count", "dtype": "int64"}, {"name": "+1", "dtype": "int64"}, {"name": "-1", "dtype": "int64"}, {"name": "laugh", "dtype": "int64"}, {"name": "hooray", "dtype": "int64"}, {"name": "confused", "dtype": "int64"}, {"name": "heart", "dtype": "int64"}, {"name": "rocket", "dtype": "int64"}, {"name": "eyes", "dtype": "int64"}]}, {"name": "timeline_url", "dtype": "string"}, {"name": "performed_via_github_app", "dtype": "null"}, {"name": "state_reason", "dtype": "string"}, {"name": "draft", "dtype": "bool"}, {"name": "pull_request", "struct": [{"name": "url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "diff_url", "dtype": "string"}, {"name": "patch_url", "dtype": "string"}, {"name": "merged_at", "dtype": "timestamp[s]"}]}, {"name": "is_pull_request", "dtype": "bool"}], "splits": [{"name": "train", "num_bytes": 26192134, "num_examples": 3817}], "download_size": 7664986, "dataset_size": 26192134}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T08:38:39+00:00 |
aca821e8944ae52ff17b607fd4b6afe5807c7ca2 | shnl/qg-example | [
"region:us"
] | 2024-02-16T08:35:28+00:00 | {} | 2024-02-17T10:17:54+00:00 |
|
c235cd77c5d598eca11ee5f1916c2cd6ad62618c |
# Dataset Card for Evaluation run of fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged](https://huggingface.co/fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged",
"harness_winogrande_5",
split="train")
```
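Along the same lines, a minimal sketch of pulling the aggregated scores, assuming the extra configuration is literally named "results" and exposes the same `train` split convention as the per-task configurations:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated scores of the run;
# per the card, the "train" split points to the latest results.
results = load_dataset(
    "open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged",
    "results",
    split="train",
)
print(results[0])
```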
## Latest results
These are the [latest results from run 2024-02-16T08:43:34.747997](https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged/blob/main/results_2024-02-16T08-43-34.747997.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6443352125855402,
"acc_stderr": 0.03223578518541491,
"acc_norm": 0.6464316260111327,
"acc_norm_stderr": 0.03288034667596033,
"mc1": 0.36964504283965727,
"mc1_stderr": 0.016898180706973884,
"mc2": 0.5318297182928406,
"mc2_stderr": 0.015213885422385947
},
"harness|arc:challenge|25": {
"acc": 0.6109215017064846,
"acc_stderr": 0.014247309976045607,
"acc_norm": 0.6578498293515358,
"acc_norm_stderr": 0.013864152159177275
},
"harness|hellaswag|10": {
"acc": 0.6619199362676758,
"acc_stderr": 0.004720891597174729,
"acc_norm": 0.8526190001991635,
"acc_norm_stderr": 0.0035376085010691773
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252603,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252603
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.049665709039785295,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.049665709039785295
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6085106382978723,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.6085106382978723,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055256,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055256
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.02805779167298901,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.02805779167298901
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.029670906124630882,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.029670906124630882
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163248,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163248
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468358,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468358
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4201117318435754,
"acc_stderr": 0.016507671073256402,
"acc_norm": 0.4201117318435754,
"acc_norm_stderr": 0.016507671073256402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.02536060379624256,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.02536060379624256
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885142,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885142
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.023788583551658533,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.023788583551658533
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.012687818419599923,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.012687818419599923
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.0189754279205072,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.0189754279205072
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36964504283965727,
"mc1_stderr": 0.016898180706973884,
"mc2": 0.5318297182928406,
"mc2_stderr": 0.015213885422385947
},
"harness|winogrande|5": {
"acc": 0.7892659826361483,
"acc_stderr": 0.011462046419710676
},
"harness|gsm8k|5": {
"acc": 0.6133434420015162,
"acc_stderr": 0.013413955095965307
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged | [
"region:us"
] | 2024-02-16T08:45:55+00:00 | {"pretty_name": "Evaluation run of fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged", "dataset_summary": "Dataset automatically created during the evaluation run of model [fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged](https://huggingface.co/fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T08:43:34.747997](https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged/blob/main/results_2024-02-16T08-43-34.747997.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6443352125855402,\n \"acc_stderr\": 0.03223578518541491,\n \"acc_norm\": 0.6464316260111327,\n \"acc_norm_stderr\": 0.03288034667596033,\n \"mc1\": 0.36964504283965727,\n \"mc1_stderr\": 0.016898180706973884,\n \"mc2\": 0.5318297182928406,\n \"mc2_stderr\": 0.015213885422385947\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6109215017064846,\n \"acc_stderr\": 0.014247309976045607,\n \"acc_norm\": 0.6578498293515358,\n \"acc_norm_stderr\": 0.013864152159177275\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6619199362676758,\n \"acc_stderr\": 0.004720891597174729,\n \"acc_norm\": 0.8526190001991635,\n \"acc_norm_stderr\": 0.0035376085010691773\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252603,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252603\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.049665709039785295,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.049665709039785295\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6085106382978723,\n \"acc_stderr\": 0.03190701242326812,\n \"acc_norm\": 0.6085106382978723,\n \"acc_norm_stderr\": 0.03190701242326812\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055256,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055256\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.02805779167298901,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.02805779167298901\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3851851851851852,\n \"acc_stderr\": 0.029670906124630882,\n \"acc_norm\": 0.3851851851851852,\n \"acc_norm_stderr\": 0.029670906124630882\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163248,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163248\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468358,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468358\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4201117318435754,\n \"acc_stderr\": 0.016507671073256402,\n \"acc_norm\": 0.4201117318435754,\n \"acc_norm_stderr\": 0.016507671073256402\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.02536060379624256,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.02536060379624256\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.023788583551658533,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.023788583551658533\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n \"acc_stderr\": 0.012687818419599923,\n \"acc_norm\": 0.44328552803129073,\n \"acc_norm_stderr\": 0.012687818419599923\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.0189754279205072,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.0189754279205072\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399677,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399677\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36964504283965727,\n \"mc1_stderr\": 0.016898180706973884,\n \"mc2\": 0.5318297182928406,\n \"mc2_stderr\": 0.015213885422385947\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7892659826361483,\n \"acc_stderr\": 0.011462046419710676\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6133434420015162,\n \"acc_stderr\": 0.013413955095965307\n 
}\n}\n```", "repo_url": "https://huggingface.co/fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|arc:challenge|25_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|gsm8k|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hellaswag|10_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T08-43-34.747997.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T08-43-34.747997.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T08-43-34.747997.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T08-43-34.747997.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T08-43-34.747997.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T08_43_34.747997", "path": ["**/details_harness|winogrande|5_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T08-43-34.747997.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_16T08_43_34.747997", "path": ["results_2024-02-16T08-43-34.747997.parquet"]}, {"split": "latest", "path": ["results_2024-02-16T08-43-34.747997.parquet"]}]}]} | 2024-02-16T08:46:16+00:00 |
bbbd3c5300b166e06cc5bfda2a91f0fee0072fb9 | hesum-anonymous/HeSum | [
"region:us"
] | 2024-02-16T08:48:35+00:00 | {"dataset_info": {"features": [{"name": "summary", "dtype": "string"}, {"name": "article", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 98933510, "num_examples": 8000}, {"name": "validation", "num_bytes": 12217867, "num_examples": 1000}, {"name": "test", "num_bytes": 13227741, "num_examples": 1000}], "download_size": 63278488, "dataset_size": 124379118}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-16T08:51:20+00:00 |
|
738924d6dd2fa02c333859f79ce245a257da3aa0 | pparshakov/grdmr_test_zoo_648292 | [
"license:mit",
"region:us"
] | 2024-02-16T08:51:56+00:00 | {"license": "mit"} | 2024-02-16T08:51:56+00:00 |
|
4bcf11cdce4babc936535df6a2125c42850180f9 | sanjay920/cortex-codellama | [
"region:us"
] | 2024-02-16T08:55:16+00:00 | {"dataset_info": {"features": [{"name": "topic", "dtype": "string"}, {"name": "views", "dtype": "int64"}, {"name": "source", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "skip_prompt_formatting", "dtype": "bool"}, {"name": "idx", "dtype": "string"}, {"name": "model", "dtype": "string"}, {"name": "language", "dtype": "string"}, {"name": "hash", "sequence": "int64"}, {"name": "model_name", "dtype": "string"}, {"name": "custom_instruction", "dtype": "bool"}, {"name": "system_prompt", "dtype": "string"}, {"name": "id", "dtype": "string"}, {"name": "category", "dtype": "string"}, {"name": "avatarUrl", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3572547378, "num_examples": 1996447}], "download_size": 1817368547, "dataset_size": 3572547378}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-17T00:30:26+00:00 |
|
924dc1115c591e8b2c262d4488c795bc456d18a0 | sofiapaklina/grdmr_test_zoo_648292 | [
"task_categories:text-generation",
"task_categories:text2text-generation",
"size_categories:10K<n<100K",
"language:ru",
"license:cc-by-4.0",
"chat",
"region:us"
] | 2024-02-16T08:57:36+00:00 | {"language": ["ru"], "license": "cc-by-4.0", "size_categories": ["10K<n<100K"], "task_categories": ["text-generation", "text2text-generation"], "dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}]}, "tags": ["chat"]} | 2024-02-16T10:24:52+00:00 |
|
f325033178bd3524da65f172d96868f93209dedc | bjak/common_voice_13_0_thai_small_pseudo_labelled | [
"region:us"
] | 2024-02-16T09:10:42+00:00 | {} | 2024-02-16T09:14:41+00:00 |
|
2926132daebda3e0714f5f46d7719af02fc79654 | ineoApp/dataset-02-16 | [
"region:us"
] | 2024-02-16T09:10:53+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "image", "dtype": "image"}, {"name": "bboxes", "sequence": {"sequence": "int64"}}, {"name": "ner_tags", "sequence": {"class_label": {"names": {"0": "O", "1": "numero facture", "2": "Telephone", "3": "Email", "4": "Site web", "5": "RC", "6": "CNSS", "7": "TP", "8": "Mode de paiement", "9": "fournisseur", "10": "date facture", "11": "date limite", "12": "montant ht", "13": "montant ttc", "14": "tva", "15": "prix tva", "16": "addresse", "17": "reference", "18": "Devise", "19": "ICE fournisseur", "20": "IF fournisseur", "21": "Condition de paiement", "22": "informations", "23": "art1 designation", "24": "art1 quantite", "25": "art1 prix unit", "26": "art1 tva", "27": "art1 montant ht", "28": "art1 Article", "29": "art1 taux de remise", "30": "art2 designation", "31": "art2 quantite", "32": "art2 prix unit", "33": "art2 tva", "34": "art2 montant ht", "35": "art2 Article", "36": "art2 taux de remise", "37": "art3 designation", "38": "art3 quantite", "39": "art3 prix unit", "40": "art3 tva", "41": "art3 montant ht", "42": "art3 Article", "43": "art3 taux de remise", "44": "art4 designation", "45": "art4 quantite", "46": "art4 prix unit", "47": "art4 tva", "48": "art4 montant ht", "49": "art4 Article", "50": "art4 taux de remise", "51": "art5 designation", "52": "art5 quantite", "53": "art5 prix unit", "54": "art5 tva", "55": "art5 montant ht", "56": "art5 Article", "57": "art5 taux de remise", "58": "art6 designation", "59": "art6 quantite", "60": "art6 prix unit", "61": "art6 tva", "62": "art6 montant ht", "63": "art6 Article", "64": "art6 taux de remise", "65": "art7 designation", "66": "art7 quantite", "67": "art7 prix unit", "68": "art7 tva", "69": "art7 montant ht", "70": "art7 Article", "71": "art7 taux de remise", "72": "art8 designation", "73": "art8 quantite", "74": "art8 prix unit", "75": "art8 tva", "76": "art8 montant ht", "77": "art8 Article", "78": "art8 taux de remise", "79": "art9 designation", "80": "art9 quantite", "81": "art9 prix unit", "82": "art9 tva", "83": "art9 montant ht", "84": "art9 Article", "85": "art9 taux de remise", "86": "art10 designation", "87": "art10 quantite", "88": "art10 prix unit", "89": "art10 tva", "90": "art10 montant ht", "91": "art10 Article", "92": "art10 taux de remise", "93": "art11 designation", "94": "art11 quantite", "95": "art11 prix unit", "96": "art11 tva", "97": "art11 montant ht", "98": "art11 Article", "99": "art11 taux de remise", "100": "art12 designation", "101": "art12 quantite", "102": "art12 prix unit", "103": "art12 tva", "104": "art12 montant ht", "105": "art12 Article", "106": "art12 taux de remise", "107": "art13 designation", "108": "art13 quantite", "109": "art13 prix unit", "110": "art13 tva", "111": "art13 montant ht", "112": "art13 Article", "113": "art13 taux de remise", "114": "art14 designation", "115": "art14 quantite", "116": "art14 prix unit", "117": "art14 tva", "118": "art14 montant ht", "119": "art14 Article", "120": "art14 taux de remise"}}}}, {"name": "tokens", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 362851126.49572647, "num_examples": 280}, {"name": "test", "num_bytes": 92008678.5042735, "num_examples": 71}], "download_size": 435978677, "dataset_size": 454859805.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-16T12:20:39+00:00 |
|
7a94aa1e1dd0996e67122e5b714f61db0bb57177 | Doowon96/news_comment_archive | [
"region:us"
] | 2024-02-16T09:14:55+00:00 | {} | 2024-02-16T09:22:51+00:00 |
|
22282bbe067789a4c01ead7604018cfe8bc7af4b | indiehackers/tenglish_dataset | [
"region:us"
] | 2024-02-16T09:25:04+00:00 | {"dataset_info": [{"config_name": "telugu_asr", "features": [{"name": "text", "dtype": "string"}, {"name": "translit", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 68400135, "num_examples": 209270}], "download_size": 34092313, "dataset_size": 68400135}, {"config_name": "telugu_nlp", "features": [{"name": "text", "dtype": "string"}, {"name": "translit", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 553825446, "num_examples": 47415}], "download_size": 251907089, "dataset_size": 553825446}, {"config_name": "wikipedia", "features": [{"name": "text", "dtype": "string"}, {"name": "translit", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1025395437, "num_examples": 87854}], "download_size": 341788327, "dataset_size": 1025395437}], "configs": [{"config_name": "telugu_asr", "data_files": [{"split": "train", "path": "telugu_asr/train-*"}]}, {"config_name": "telugu_nlp", "data_files": [{"split": "train", "path": "telugu_nlp/train-*"}]}, {"config_name": "wikipedia", "data_files": [{"split": "train", "path": "wikipedia/train-*"}]}]} | 2024-02-16T19:12:24+00:00 |
|
d5a41ed2ad8394928a3a9b10384d00083c662ea2 | jamesagilesoda/ko-corpus-cleaned-12653878-chunk_00040_00059 | [
"region:us"
] | 2024-02-16T09:28:19+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 10004240941, "num_examples": 1259100}], "download_size": 5763871904, "dataset_size": 10004240941}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T09:36:43+00:00 |
|
fcec12698818c1deb71f5588fb40776607fc89dc | Saurav1996/guanaco-llama2-1k | [
"region:us"
] | 2024-02-16T09:32:58+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1654448, "num_examples": 1000}], "download_size": 966692, "dataset_size": 1654448}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T09:32:59+00:00 |
|
c346332921de87fe8e9ac35f435df659c79c114b | ammaralam/medical_Ar | [
"region:us"
] | 2024-02-16T09:34:05+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 36864, "num_examples": 98}], "download_size": 9832, "dataset_size": 36864}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T09:34:06+00:00 |
|
0aab926c4653e61c7f66cb8b1f4b816a8d296d6c | saracandu/harry-potter-trivia-ai | [
"region:us"
] | 2024-02-16T09:36:05+00:00 | {} | 2024-02-16T09:37:04+00:00 |
|
d508bce6e70e866b3eef9682770691579bea6ed8 | Brendan/multiwoz_turns_v22_test_set_partitioned_smaller | [
"region:us"
] | 2024-02-16T09:39:42+00:00 | {"dataset_info": {"features": [{"name": "dialogue_id", "dtype": "string"}, {"name": "turn_id", "dtype": "int8"}, {"name": "domains", "sequence": "string"}, {"name": "system_utterances", "sequence": "string"}, {"name": "user_utterances", "sequence": "string"}, {"name": "slot_values", "struct": [{"name": "hotel", "struct": [{"name": "price range", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "parking", "dtype": "string"}, {"name": "book day", "dtype": "string"}, {"name": "book people", "dtype": "string"}, {"name": "book stay", "dtype": "string"}, {"name": "stars", "dtype": "string"}, {"name": "internet", "dtype": "string"}, {"name": "name", "dtype": "string"}, {"name": "area", "dtype": "string"}]}, {"name": "train", "struct": [{"name": "arrive by", "dtype": "string"}, {"name": "departure", "dtype": "string"}, {"name": "day", "dtype": "string"}, {"name": "book people", "dtype": "string"}, {"name": "leave at", "dtype": "string"}, {"name": "destination", "dtype": "string"}]}, {"name": "attraction", "struct": [{"name": "area", "dtype": "string"}, {"name": "name", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "restaurant", "struct": [{"name": "price range", "dtype": "string"}, {"name": "area", "dtype": "string"}, {"name": "food", "dtype": "string"}, {"name": "name", "dtype": "string"}, {"name": "book day", "dtype": "string"}, {"name": "book people", "dtype": "string"}, {"name": "book time", "dtype": "string"}]}, {"name": "hospital", "struct": [{"name": "department", "dtype": "string"}]}, {"name": "taxi", "struct": [{"name": "leave at", "dtype": "string"}, {"name": "destination", "dtype": "string"}, {"name": "departure", "dtype": "string"}, {"name": "arrive by", "dtype": "string"}]}, {"name": "bus", "struct": [{"name": "departure", "dtype": "string"}, {"name": "destination", "dtype": "string"}, {"name": "leave at", "dtype": "string"}, {"name": "day", "dtype": "string"}]}, {"name": "police", "struct": [{"name": "name", "dtype": "string"}]}]}, {"name": "turn_slot_values", "struct": [{"name": "hotel", "struct": [{"name": "price range", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "parking", "dtype": "string"}, {"name": "book day", "dtype": "string"}, {"name": "book people", "dtype": "string"}, {"name": "book stay", "dtype": "string"}, {"name": "stars", "dtype": "string"}, {"name": "internet", "dtype": "string"}, {"name": "name", "dtype": "string"}, {"name": "area", "dtype": "string"}]}, {"name": "train", "struct": [{"name": "arrive by", "dtype": "string"}, {"name": "departure", "dtype": "string"}, {"name": "day", "dtype": "string"}, {"name": "book people", "dtype": "string"}, {"name": "leave at", "dtype": "string"}, {"name": "destination", "dtype": "string"}]}, {"name": "attraction", "struct": [{"name": "area", "dtype": "string"}, {"name": "name", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "restaurant", "struct": [{"name": "price range", "dtype": "string"}, {"name": "area", "dtype": "string"}, {"name": "food", "dtype": "string"}, {"name": "name", "dtype": "string"}, {"name": "book day", "dtype": "string"}, {"name": "book people", "dtype": "string"}, {"name": "book time", "dtype": "string"}]}, {"name": "hospital", "struct": [{"name": "department", "dtype": "string"}]}, {"name": "taxi", "struct": [{"name": "leave at", "dtype": "string"}, {"name": "destination", "dtype": "string"}, {"name": "departure", "dtype": "string"}, {"name": "arrive by", "dtype": "string"}]}, {"name": "bus", 
"struct": [{"name": "departure", "dtype": "string"}, {"name": "destination", "dtype": "string"}, {"name": "leave at", "dtype": "string"}, {"name": "day", "dtype": "string"}]}, {"name": "police", "struct": [{"name": "name", "dtype": "string"}]}]}, {"name": "last_slot_values", "struct": [{"name": "hotel", "struct": [{"name": "price range", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "parking", "dtype": "string"}, {"name": "book day", "dtype": "string"}, {"name": "book people", "dtype": "string"}, {"name": "book stay", "dtype": "string"}, {"name": "stars", "dtype": "string"}, {"name": "internet", "dtype": "string"}, {"name": "name", "dtype": "string"}, {"name": "area", "dtype": "string"}]}, {"name": "train", "struct": [{"name": "arrive by", "dtype": "string"}, {"name": "departure", "dtype": "string"}, {"name": "day", "dtype": "string"}, {"name": "book people", "dtype": "string"}, {"name": "leave at", "dtype": "string"}, {"name": "destination", "dtype": "string"}]}, {"name": "attraction", "struct": [{"name": "area", "dtype": "string"}, {"name": "name", "dtype": "string"}, {"name": "type", "dtype": "string"}]}, {"name": "restaurant", "struct": [{"name": "price range", "dtype": "string"}, {"name": "area", "dtype": "string"}, {"name": "food", "dtype": "string"}, {"name": "name", "dtype": "string"}, {"name": "book day", "dtype": "string"}, {"name": "book people", "dtype": "string"}, {"name": "book time", "dtype": "string"}]}, {"name": "hospital", "struct": [{"name": "department", "dtype": "string"}]}, {"name": "taxi", "struct": [{"name": "leave at", "dtype": "string"}, {"name": "destination", "dtype": "string"}, {"name": "departure", "dtype": "string"}, {"name": "arrive by", "dtype": "string"}]}, {"name": "bus", "struct": [{"name": "departure", "dtype": "string"}, {"name": "destination", "dtype": "string"}, {"name": "leave at", "dtype": "string"}, {"name": "day", "dtype": "string"}]}, {"name": "police", "struct": [{"name": "name", "dtype": "string"}]}]}, {"name": "last_system_response_acts", "sequence": "string"}, {"name": "system_response_acts", "sequence": "string"}, {"name": "system_response", "dtype": "string"}], "splits": [{"name": "test_0_25", "num_bytes": 301839.2740097667, "num_examples": 197}, {"name": "test_25_50", "num_bytes": 271195.6928920239, "num_examples": 177}, {"name": "test_50_75", "num_bytes": 262002.61855670103, "num_examples": 171}, {"name": "test_75_100", "num_bytes": 281920.9462832339, "num_examples": 184}, {"name": "test_100_125", "num_bytes": 281920.9462832339, "num_examples": 184}, {"name": "test_125_150", "num_bytes": 277324.40911557246, "num_examples": 181}, {"name": "test_150_175", "num_bytes": 306435.8111774281, "num_examples": 200}, {"name": "test_175_200", "num_bytes": 262002.61855670103, "num_examples": 171}, {"name": "test_200_225", "num_bytes": 280388.7672273467, "num_examples": 183}, {"name": "test_225_250", "num_bytes": 275792.2300596853, "num_examples": 180}, {"name": "test_250_275", "num_bytes": 280388.7672273467, "num_examples": 183}, {"name": "test_275_300", "num_bytes": 281920.9462832339, "num_examples": 184}, {"name": "test_300_325", "num_bytes": 265066.9766684753, "num_examples": 173}, {"name": "test_325_350", "num_bytes": 304903.63212154096, "num_examples": 199}, {"name": "test_350_375", "num_bytes": 283453.125339121, "num_examples": 185}, {"name": "test_375_400", "num_bytes": 271195.6928920239, "num_examples": 177}, {"name": "test_400_425", "num_bytes": 266599.15572436247, "num_examples": 174}, {"name": "test_425_450", 
"num_bytes": 258938.26044492674, "num_examples": 169}, {"name": "test_450_475", "num_bytes": 295710.5577862181, "num_examples": 193}, {"name": "test_475_500", "num_bytes": 262002.61855670103, "num_examples": 171}, {"name": "test_500_525", "num_bytes": 297242.7368421053, "num_examples": 194}, {"name": "test_525_550", "num_bytes": 298774.9158979924, "num_examples": 195}, {"name": "test_550_575", "num_bytes": 303371.4530656538, "num_examples": 198}, {"name": "test_575_600", "num_bytes": 283453.125339121, "num_examples": 185}, {"name": "test_600_625", "num_bytes": 326354.13890396093, "num_examples": 213}, {"name": "test_625_650", "num_bytes": 255873.90233315248, "num_examples": 167}, {"name": "test_650_675", "num_bytes": 281920.9462832339, "num_examples": 184}, {"name": "test_675_700", "num_bytes": 314096.7064568638, "num_examples": 205}, {"name": "test_700_725", "num_bytes": 314096.7064568638, "num_examples": 205}, {"name": "test_725_750", "num_bytes": 268131.33478024957, "num_examples": 175}, {"name": "test_750_775", "num_bytes": 278856.58817145956, "num_examples": 182}, {"name": "test_775_800", "num_bytes": 237487.7536625068, "num_examples": 155}, {"name": "test_800_825", "num_bytes": 280388.7672273467, "num_examples": 183}, {"name": "test_825_850", "num_bytes": 286517.4834508953, "num_examples": 187}, {"name": "test_850_875", "num_bytes": 286517.4834508953, "num_examples": 187}, {"name": "test_875_900", "num_bytes": 309500.1692892024, "num_examples": 202}, {"name": "test_900_925", "num_bytes": 280388.7672273467, "num_examples": 183}, {"name": "test_925_950", "num_bytes": 278856.58817145956, "num_examples": 182}, {"name": "test_950_975", "num_bytes": 274260.05100379814, "num_examples": 179}, {"name": "test_975_1000", "num_bytes": 268131.33478024957, "num_examples": 175}], "download_size": 4088912, "dataset_size": 11295223.999999998}, "configs": [{"config_name": "default", "data_files": [{"split": "test_0_25", "path": "data/test_0_25-*"}, {"split": "test_25_50", "path": "data/test_25_50-*"}, {"split": "test_50_75", "path": "data/test_50_75-*"}, {"split": "test_75_100", "path": "data/test_75_100-*"}, {"split": "test_100_125", "path": "data/test_100_125-*"}, {"split": "test_125_150", "path": "data/test_125_150-*"}, {"split": "test_150_175", "path": "data/test_150_175-*"}, {"split": "test_175_200", "path": "data/test_175_200-*"}, {"split": "test_200_225", "path": "data/test_200_225-*"}, {"split": "test_225_250", "path": "data/test_225_250-*"}, {"split": "test_250_275", "path": "data/test_250_275-*"}, {"split": "test_275_300", "path": "data/test_275_300-*"}, {"split": "test_300_325", "path": "data/test_300_325-*"}, {"split": "test_325_350", "path": "data/test_325_350-*"}, {"split": "test_350_375", "path": "data/test_350_375-*"}, {"split": "test_375_400", "path": "data/test_375_400-*"}, {"split": "test_400_425", "path": "data/test_400_425-*"}, {"split": "test_425_450", "path": "data/test_425_450-*"}, {"split": "test_450_475", "path": "data/test_450_475-*"}, {"split": "test_475_500", "path": "data/test_475_500-*"}, {"split": "test_500_525", "path": "data/test_500_525-*"}, {"split": "test_525_550", "path": "data/test_525_550-*"}, {"split": "test_550_575", "path": "data/test_550_575-*"}, {"split": "test_575_600", "path": "data/test_575_600-*"}, {"split": "test_600_625", "path": "data/test_600_625-*"}, {"split": "test_625_650", "path": "data/test_625_650-*"}, {"split": "test_650_675", "path": "data/test_650_675-*"}, {"split": "test_675_700", "path": "data/test_675_700-*"}, {"split": "test_700_725", 
"path": "data/test_700_725-*"}, {"split": "test_725_750", "path": "data/test_725_750-*"}, {"split": "test_750_775", "path": "data/test_750_775-*"}, {"split": "test_775_800", "path": "data/test_775_800-*"}, {"split": "test_800_825", "path": "data/test_800_825-*"}, {"split": "test_825_850", "path": "data/test_825_850-*"}, {"split": "test_850_875", "path": "data/test_850_875-*"}, {"split": "test_875_900", "path": "data/test_875_900-*"}, {"split": "test_900_925", "path": "data/test_900_925-*"}, {"split": "test_925_950", "path": "data/test_925_950-*"}, {"split": "test_950_975", "path": "data/test_950_975-*"}, {"split": "test_975_1000", "path": "data/test_975_1000-*"}]}]} | 2024-02-16T09:40:13+00:00 |
|
239147154af21d4c751b6a276b7e8c70659442b4 | NYTK/alpaca_hu_2k | [
"license:cc-by-nc-4.0",
"region:us"
] | 2024-02-16T09:41:30+00:00 | {"license": "cc-by-nc-4.0"} | 2024-02-16T09:42:35+00:00 |
|
05c5abd3971b12e64725d2a6cc9a093aee2b587e | # Next Relation Prediction on the UMLS KG
The dataset is used to train and evaluate a model that generates the next relation to follow (if one is available), given a MedQA question and the relations explored so far, and otherwise indicates the end (END). | neural-subgraph-retrieval/umls-next-relation-prediction | [
"license:apache-2.0",
"region:us"
] | 2024-02-16T09:44:07+00:00 | {"license": "apache-2.0", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "eval", "path": "data/eval-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "question_id", "dtype": "uint16"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 143262, "num_examples": 220}, {"name": "eval", "num_bytes": 18086, "num_examples": 27}, {"name": "test", "num_bytes": 18739, "num_examples": 29}], "download_size": 28244, "dataset_size": 180087}} | 2024-02-16T21:13:48+00:00 |
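A minimal sketch of how the UMLS next-relation-prediction entry above could be loaded and inspected; the split names (train/eval/test) and the question_id/input/output columns come from its metadata, everything else is illustrative:

```python
from datasets import load_dataset

# Hedged sketch: split names and columns are taken from the dataset metadata above.
ds = load_dataset("neural-subgraph-retrieval/umls-next-relation-prediction", split="train")

example = ds[0]
print(example["input"])   # MedQA question plus the relations explored so far
print(example["output"])  # the next relation to follow, or END
```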
7985ab4b2ce11ac91cfe32857d6be8576cc8ad75 | vzach/bunny_v0 | [
"license:unknown",
"region:us"
] | 2024-02-16T09:45:58+00:00 | {"license": "unknown", "dataset_info": {"features": [{"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 16159438, "num_examples": 14845}], "download_size": 9510825, "dataset_size": 16159438}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T17:14:51+00:00 |
|
4348359d51358dc1e745adf2f7df901b8aaa5dc4 | Tippawan/snm6 | [
"region:us"
] | 2024-02-16T09:50:25+00:00 | {} | 2024-02-16T09:52:02+00:00 |
|
3aab3801fe1986cc4e3532d928a53bd14e1aef12 | # Alpaca-cleaned-gemini-hun
A Hungarian-language dataset based on Stanford Alpaca. The starting point was [yahma/alpaca-cleaned](https://huggingface.co/datasets/yahma/alpaca-cleaned). The instructions, and where needed the inputs (non-empty / containing words / not code), were translated with Google Translate. Since the translation is far from perfect, the original English instructions were kept as well.
The responses were then generated with Google's gemini-pro model, and these are what ended up in this dataset. This took roughly 20 hours, but at least it was free. In the meantime a new model, gemini-pro-1.5, was released, which gives somewhat better results. Unfortunately generation was already running and I did not want to stop it, so it is possible that part of the dataset was already produced with it. If I find the time, I will go through it again and fix the errors. | Bazsalanszky/alpaca-cleaned-gemini-hun | [
"language:hu",
"license:cc-by-4.0",
"region:us"
] | 2024-02-16T09:50:44+00:00 | {"language": ["hu"], "license": "cc-by-4.0", "dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "original_instruction", "dtype": "string"}, {"name": "response", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 52533889, "num_examples": 51760}], "download_size": 30595403, "dataset_size": 52533889}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T12:08:17+00:00 |
51a4ce5e27ac2440d1527753f75c01633c2d2a28 | iulusoy/test-images | [
"license:mit",
"region:us"
] | 2024-02-16T09:56:14+00:00 | {"license": "mit"} | 2024-02-16T09:56:52+00:00 |
|
a9fe82e63c6aa8482049a07f13be099c8bfd8016 | ProjectNoob/mini-platypus | [
"region:us"
] | 2024-02-16T09:57:49+00:00 | {"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4186564, "num_examples": 1000}], "download_size": 2245921, "dataset_size": 4186564}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T09:57:51+00:00 |
|
1c289bec227211b7256decd83aee9150a0290672 | aintech/vdf_configs | [
"license:apache-2.0",
"region:us"
] | 2024-02-16T10:00:14+00:00 | {"license": "apache-2.0"} | 2024-02-16T10:00:14+00:00 |
|
20384e91c050eef3344cd0f5def6c5deca522c02 | daswer123/sentence-danbooru-tags-dataset | [
"license:mit",
"region:us"
] | 2024-02-16T10:00:32+00:00 | {"license": "mit"} | 2024-02-16T16:22:13+00:00 |
|
9b454ece3720362b7010a163f989918dbfdc224c | benayas/banking_augmented_10pct_v1 | [
"region:us"
] | 2024-02-16T10:05:20+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "category", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1022593, "num_examples": 10003}], "download_size": 415819, "dataset_size": 1022593}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T10:05:22+00:00 |
|
eafb3ea645d7be3974fc3cd837be29bd46571f41 | suthawadee/receipt_th_train | [
"region:us"
] | 2024-02-16T10:06:18+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "string"}, {"name": "ground_truth", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 135948, "num_examples": 80}], "download_size": 39317, "dataset_size": 135948}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T10:06:22+00:00 |
|
d8fa64a08d84c1e5a4f7c1ca109873f14a14935a | suthawadee/receipt_th_validation | [
"region:us"
] | 2024-02-16T10:06:23+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "string"}, {"name": "ground_truth", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 10831, "num_examples": 10}], "download_size": 8776, "dataset_size": 10831}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T10:06:27+00:00 |
|
b1099745a44670d5a3f16b185bc35867e446480e | suthawadee/receipt_th_test | [
"region:us"
] | 2024-02-16T10:06:27+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "string"}, {"name": "ground_truth", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 11990, "num_examples": 10}], "download_size": 12152, "dataset_size": 11990}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T10:06:30+00:00 |
|
f8fe3ceea57e1fb66b0c27494f727a5ef55cc37d | ErikQQY/example | [
"region:us"
] | 2024-02-16T10:08:21+00:00 | {} | 2024-02-16T10:09:06+00:00 |
|
2c03723d980974867c31b00be7c2b2cd1e2cf0dc | Aerobotics/belly-angle-selection-in-office-apples | [
"region:us"
] | 2024-02-16T10:10:13+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "Index", "dtype": "int64"}, {"name": "ccg_id", "dtype": "int64"}, {"name": "ffo_id", "dtype": "int64"}, {"name": "angle", "dtype": "float64"}, {"name": "prod_minor_axis_mm", "dtype": "float64"}, {"name": "prod_major_axis_mm", "dtype": "float64"}, {"name": "size_mm_triangulate_just_the_belly", "dtype": "float64"}, {"name": "size_mm_triangulate_just_the_belly_dropped_z", "dtype": "float64"}, {"name": "size_mm_select_belly_angle_of_reprojected_ellipse", "dtype": "float64"}, {"name": "gt_size_mm", "dtype": "float64"}, {"name": "label", "dtype": "float64"}, {"name": "belly_angle", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 1812931.0, "num_examples": 118}], "download_size": 1815660, "dataset_size": 1812931.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T10:10:17+00:00 |
|
d19819bab6869f78794c66ceff57c9639e844be6 | isp-uv-es/opensr-test | [
"license:mit",
"region:us"
] | 2024-02-16T10:13:49+00:00 | {"license": "mit"} | 2024-02-16T10:32:30+00:00 |
|
91bbc77a286d2607f0ba9ba05871130718655fe8 | jlbaker361/cyberpunk-250-cropped | [
"region:us"
] | 2024-02-16T10:14:24+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}, {"name": "frame", "dtype": "int64"}, {"name": "title", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 209436505.0, "num_examples": 985}], "download_size": 209402884, "dataset_size": 209436505.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T10:14:32+00:00 |
|
1c3d25129a5778547e57b01dccb697ae4f422607 |
# Dataset Card for Evaluation run of fzzhang/mistralv1_gsm8k_merged
Dataset automatically created during the evaluation run of model [fzzhang/mistralv1_gsm8k_merged](https://huggingface.co/fzzhang/mistralv1_gsm8k_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged",
"harness_winogrande_5",
split="train")
```
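The aggregated metrics live in the additional "results" configuration mentioned above; a minimal sketch for loading them, assuming this repository follows the usual leaderboard layout with a "latest" split:

```python
from datasets import load_dataset

# Hedged sketch: the "results" configuration and "latest" split are assumed from
# the standard Open LLM Leaderboard details layout described above.
results = load_dataset(
    "open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the most recent run
```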
## Latest results
These are the [latest results from run 2024-02-16T10:12:02.378813](https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged/blob/main/results_2024-02-16T10-12-02.378813.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6294515756856921,
"acc_stderr": 0.03258557096163295,
"acc_norm": 0.6334878028725921,
"acc_norm_stderr": 0.033239152993573814,
"mc1": 0.26438188494492043,
"mc1_stderr": 0.015438211119522514,
"mc2": 0.39552847129229235,
"mc2_stderr": 0.014053302822098265
},
"harness|arc:challenge|25": {
"acc": 0.5656996587030717,
"acc_stderr": 0.01448470304885736,
"acc_norm": 0.613481228668942,
"acc_norm_stderr": 0.014230084761910471
},
"harness|hellaswag|10": {
"acc": 0.6336387173869747,
"acc_stderr": 0.004808251269682433,
"acc_norm": 0.8311093407687712,
"acc_norm_stderr": 0.0037388962449538187
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800886,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800886
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.03514942551267439,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.03514942551267439
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4851063829787234,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.4851063829787234,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055256,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055256
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.02447224384089552,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.02447224384089552
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.02443301646605246,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.02443301646605246
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131147,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131147
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266868,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266868
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.033851779760448106,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.033851779760448106
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591205,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591205
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38100558659217876,
"acc_stderr": 0.016242028834053616,
"acc_norm": 0.38100558659217876,
"acc_norm_stderr": 0.016242028834053616
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.024954184324879905,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.024954184324879905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.02623696588115326,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.02623696588115326
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.02500646975579922,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.02500646975579922
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43546284224250326,
"acc_stderr": 0.012663412101248333,
"acc_norm": 0.43546284224250326,
"acc_norm_stderr": 0.012663412101248333
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128445,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128445
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454132,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454132
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26438188494492043,
"mc1_stderr": 0.015438211119522514,
"mc2": 0.39552847129229235,
"mc2_stderr": 0.014053302822098265
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.011524466954090248
},
"harness|gsm8k|5": {
"acc": 0.4799090219863533,
"acc_stderr": 0.01376136177298902
}
}
```
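If you prefer to work with these numbers outside of `datasets`, one option is to fetch the results JSON linked above and index the per-task blocks printed here. This is only a sketch: the raw-file URL (using `resolve/main`) and the possible top-level `"results"` key are assumptions about the repo layout.
```python
import json
from urllib.request import urlopen

# Raw version of the results file linked in this section (assumed resolve/main path).
url = ("https://huggingface.co/datasets/open-llm-leaderboard/"
       "details_fzzhang__mistralv1_gsm8k_merged/resolve/main/"
       "results_2024-02-16T10-12-02.378813.json")

with urlopen(url) as f:
    data = json.load(f)

# The per-task metrics may sit under a top-level "results" key (assumption);
# otherwise the file already matches the block printed above.
metrics = data.get("results", data)

print(metrics["harness|arc:challenge|25"]["acc_norm"])  # 0.6134...
print(metrics["harness|gsm8k|5"]["acc"])                # 0.4799...
```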
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged | [
"region:us"
] | 2024-02-16T10:14:24+00:00 | {"pretty_name": "Evaluation run of fzzhang/mistralv1_gsm8k_merged", "dataset_summary": "Dataset automatically created during the evaluation run of model [fzzhang/mistralv1_gsm8k_merged](https://huggingface.co/fzzhang/mistralv1_gsm8k_merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-16T10:12:02.378813](https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged/blob/main/results_2024-02-16T10-12-02.378813.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6294515756856921,\n \"acc_stderr\": 0.03258557096163295,\n \"acc_norm\": 0.6334878028725921,\n \"acc_norm_stderr\": 0.033239152993573814,\n \"mc1\": 0.26438188494492043,\n \"mc1_stderr\": 0.015438211119522514,\n \"mc2\": 0.39552847129229235,\n \"mc2_stderr\": 0.014053302822098265\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5656996587030717,\n \"acc_stderr\": 0.01448470304885736,\n \"acc_norm\": 0.613481228668942,\n \"acc_norm_stderr\": 0.014230084761910471\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6336387173869747,\n \"acc_stderr\": 0.004808251269682433,\n \"acc_norm\": 0.8311093407687712,\n \"acc_norm_stderr\": 0.0037388962449538187\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800886,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800886\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.03514942551267439,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.03514942551267439\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.032671518489247764,\n \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.032671518489247764\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055256,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055256\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n \"acc_stderr\": 0.02447224384089552,\n \"acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.02447224384089552\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605246,\n \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605246\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.818348623853211,\n \"acc_stderr\": 0.016530617409266868,\n \"acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266868\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5601851851851852,\n \"acc_stderr\": 0.033851779760448106,\n \"acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.033851779760448106\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591205,\n \"acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591205\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38100558659217876,\n \"acc_stderr\": 0.016242028834053616,\n \"acc_norm\": 0.38100558659217876,\n \"acc_norm_stderr\": 0.016242028834053616\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.024954184324879905,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.024954184324879905\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.02623696588115326,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.02623696588115326\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.02500646975579922,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.02500646975579922\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43546284224250326,\n \"acc_stderr\": 0.012663412101248333,\n \"acc_norm\": 0.43546284224250326,\n \"acc_norm_stderr\": 0.012663412101248333\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26438188494492043,\n \"mc1_stderr\": 0.015438211119522514,\n \"mc2\": 0.39552847129229235,\n \"mc2_stderr\": 0.014053302822098265\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090248\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4799090219863533,\n \"acc_stderr\": 0.01376136177298902\n }\n}\n```", "repo_url": 
"https://huggingface.co/fzzhang/mistralv1_gsm8k_merged", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|arc:challenge|25_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|gsm8k|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hellaswag|10_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T10-12-02.378813.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T10-12-02.378813.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-16T10-12-02.378813.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-16T10-12-02.378813.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T10-12-02.378813.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T10-12-02.378813.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["**/details_harness|winogrande|5_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-16T10-12-02.378813.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_16T10_12_02.378813", "path": ["results_2024-02-16T10-12-02.378813.parquet"]}, {"split": "latest", "path": 
["results_2024-02-16T10-12-02.378813.parquet"]}]}]} | 2024-02-16T10:14:45+00:00 |
92afa11f07c829c18941df51e508c8ee1ada4b0c | anonymous-acl/aeg_data | [
"region:us"
] | 2024-02-16T10:22:07+00:00 | {} | 2024-02-16T14:50:18+00:00 |
|
9034505b8898a200ab86598e02f5b57203279d11 | Recag/Rp_CommonC_656_1 | [
"region:us"
] | 2024-02-16T10:23:10+00:00 | {} | 2024-02-16T10:27:45+00:00 |
|
5528f5cb650b68dace12c03051b6210a058e5d08 | chavinlo/hdvila1m_group | [
"region:us"
] | 2024-02-16T10:23:22+00:00 | {} | 2024-02-16T10:28:45+00:00 |
|
8363810291e8b3385bd37da8d89596807fecfe30 | loubnabnl/stories_oh_problem | [
"region:us"
] | 2024-02-16T10:25:31+00:00 | {"dataset_info": {"features": [{"name": "prompt_problem_solving_story", "dtype": "string"}, {"name": "category", "dtype": "null"}, {"name": "completion", "dtype": "string"}, {"name": "token_length", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 21376595, "num_examples": 5000}], "download_size": 12467295, "dataset_size": 21376595}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T10:26:59+00:00 |
|
6cbb389953f50261faf9eca02431c62aac743caf | Recag/Rp_CommonC_656_2 | [
"region:us"
] | 2024-02-16T10:25:32+00:00 | {} | 2024-02-16T10:27:57+00:00 |
|
4834f84ad682541cb9b1b617dcc2571e7e3577dd | Recag/Rp_CommonC_658_1 | [
"region:us"
] | 2024-02-16T10:26:59+00:00 | {} | 2024-02-16T10:35:02+00:00 |
|
5fb2798473389aec60d11003718f401b6de3143a | Recag/Rp_CommonC_658_2 | [
"region:us"
] | 2024-02-16T10:27:04+00:00 | {} | 2024-02-16T10:31:53+00:00 |
|
e4db1b23586ff8a415f25700ed59d1679ddb0b01 | loubnabnl/stories_oh_children | [
"region:us"
] | 2024-02-16T10:27:23+00:00 | {"dataset_info": {"features": [{"name": "prompt_young_children_story", "dtype": "string"}, {"name": "token_length", "dtype": "int64"}, {"name": "completion", "dtype": "string"}, {"name": "category", "dtype": "null"}], "splits": [{"name": "train", "num_bytes": 17898091, "num_examples": 5000}], "download_size": 8782717, "dataset_size": 17898091}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T10:27:23+00:00 |
|
5ba4d0874b33c32ca5f51014b4af3caf5e3174d4 | loubnabnl/stories_oh_reddit | [
"region:us"
] | 2024-02-16T10:27:50+00:00 | {"dataset_info": {"features": [{"name": "token_length", "dtype": "int64"}, {"name": "prompt_reddit_post", "dtype": "string"}, {"name": "category", "dtype": "null"}, {"name": "completion", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 22033138, "num_examples": 5000}], "download_size": 11894843, "dataset_size": 22033138}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-16T10:27:52+00:00 |
|
4643242542bfad53c61a18ee1c48a0557f5ce1e9 | Recag/Rp_CommonC_659_1 | [
"region:us"
] | 2024-02-16T10:28:35+00:00 | {} | 2024-02-16T10:40:21+00:00 |
|
0b6ffafee94931245abfd6f1c1c59efbd02f223f | Recag/Rp_CommonC_659_2 | [
"region:us"
] | 2024-02-16T10:28:40+00:00 | {} | 2024-02-16T10:37:24+00:00 |
|
5b378753752f9fdca131b65a1695f3a2b7403242 | Recag/Rp_CommonC_660_1 | [
"region:us"
] | 2024-02-16T10:29:32+00:00 | {} | 2024-02-16T10:41:06+00:00 |
|
dbac4a0a4cdec3e791ecf81c9d0bbba3570addc6 | Recag/Rp_CommonC_660_2 | [
"region:us"
] | 2024-02-16T10:29:38+00:00 | {} | 2024-02-16T10:39:13+00:00 |
|
01fac3846788c293b2c6b7d6a1fe6d4fca842a8e | Recag/Rp_CommonC_661_1 | [
"region:us"
] | 2024-02-16T10:31:11+00:00 | {} | 2024-02-16T10:41:48+00:00 |
|
3b5887c4a27727e89ac56b093e1d1c1678bdc249 | Recag/Rp_CommonC_661_2 | [
"region:us"
] | 2024-02-16T10:31:17+00:00 | {} | 2024-02-16T10:40:34+00:00 |
|
b0573a384f5539f36a70c2057a422c6037656db8 | Recag/Rp_CommonC_662_1 | [
"region:us"
] | 2024-02-16T10:31:32+00:00 | {} | 2024-02-16T10:41:57+00:00 |
|
9d79badaba74a6d3449ba44ca885499447d0d56a | Recag/Rp_CommonC_662_2 | [
"region:us"
] | 2024-02-16T10:32:29+00:00 | {} | 2024-02-16T10:40:35+00:00 |
|
6ca3a6b78978ea1b71f9bcf9e11cf6ead0e03215 | dz-data-ai/WEHAGO_TAX_ASSISTANT_VER3 | [
"region:us"
] | 2024-01-12T00:38:02+00:00 | {} | 2024-01-12T00:38:22+00:00 |
|
0cb004c22177dc2cd834fb2e68ce0289e4a280a4 | DanKamNdi/sheria-caselaw | [
"region:us"
] | 2024-01-12T00:43:59+00:00 | {} | 2024-01-12T00:45:34+00:00 |
|
46223e750c438d53e15f62b7066f931b0c8858e1 | jerryjalapeno/small_transcripts_alpaca | [
"region:us"
] | 2024-01-12T00:55:18+00:00 | {} | 2024-01-12T00:56:39+00:00 |
|
a14ef448ba53a271223831baa26ba925bda030c7 | wesley7137/dcoder_SFT | [
"region:us"
] | 2024-01-12T01:15:07+00:00 | {} | 2024-01-12T01:25:45+00:00 |
|
43d9ae87e2acc41418618254785b5cc6fe72f0b6 | Berzerker/ghega_ocr_dataset | [
"region:us"
] | 2024-01-12T01:33:06+00:00 | {} | 2024-01-12T01:33:06+00:00 |
|
75cde83f97761ed036b01f38936bc62f1fd67f5f | Berzerker/MSRA_TD50_ocr_dataset | [
"region:us"
] | 2024-01-12T01:44:22+00:00 | {} | 2024-01-12T01:44:22+00:00 |
|
8c26bcb4dad949dbf6715796e74720001e52e622 | iamnguyen/tdtu_event | [
"region:us"
] | 2024-01-12T02:00:51+00:00 | {} | 2024-01-12T02:01:12+00:00 |
|
c23c524a54a9d28bf9df20733061552ab98c7a66 | iamnguyen/tdtu-news | [
"region:us"
] | 2024-01-12T02:02:22+00:00 | {} | 2024-01-12T02:02:36+00:00 |
|
13d8369d9ce8e44cbcb5a6521f7c7a90fbc635f7 | zhanjun/mmichat | [
"region:us"
] | 2024-01-12T02:08:56+00:00 | {} | 2024-01-12T02:38:10+00:00 |
|
12903a02dd16321684581bc4add9cc17c35aac4d | Aligu1hao/lol_dataset | [
"region:us"
] | 2024-01-12T02:24:33+00:00 | {} | 2024-01-12T03:02:28+00:00 |
|
79fe2857e1f9fa8994659c66723e7c4d5f635eee | JUNIORFAUSTINO/jrteste | [
"region:us"
] | 2024-01-12T02:26:11+00:00 | {} | 2024-01-12T02:28:30+00:00 |
|
1832820bda02c5a075b3e28040d07ad498054ea7 | ZhihCheng/Motor_voice_dataset | [
"region:us"
] | 2024-01-12T02:31:31+00:00 | {} | 2024-01-12T02:38:44+00:00 |
|
6df7eb489f3be5e1a23533f1de89c231bb1bfe23 | bhargavi909/Medical_Transcriptions | [
"region:us"
] | 2024-01-12T02:40:04+00:00 | {} | 2024-01-12T02:41:38+00:00 |
|
cbd618be732c664864ec3f6bc23e5c9f5b8b279d | varcoder/test | [
"region:us"
] | 2024-01-12T02:47:59+00:00 | {} | 2024-01-12T02:47:59+00:00 |
|
bd0a1eb94a5d8f650eb667e0c1d96dbccdb4f4f1 | iamnguyen/tdtu-event | [
"region:us"
] | 2024-01-12T02:52:37+00:00 | {} | 2024-01-12T02:53:07+00:00 |
|
7602b72c6cc8be75cdaaf99b07f7d97bc63ecc04 | SiyiYao/m5C | [
"region:us"
] | 2024-01-12T02:59:46+00:00 | {} | 2024-01-12T03:00:53+00:00 |
|
028a503bca73fa34f13a075155504fbd03168214 | wesley7137/physicsSFT | [
"region:us"
] | 2024-01-12T03:10:09+00:00 | {} | 2024-01-12T03:10:31+00:00 |
|
b33666ca9ab3e51727be82a66a35b979318b8fac | jdy9591/jdy9591 | [
"region:us"
] | 2024-01-12T03:18:49+00:00 | {} | 2024-01-23T23:56:12+00:00 |
|
1bc957c2f0f925c183acab423b1396d5d233b7db | safgasgfsa/lightyagami | [
"region:us"
] | 2024-01-12T03:35:12+00:00 | {} | 2024-01-12T03:36:34+00:00 |
|
b3c28c3b2f44d1d3a4b7546013d0d68bcee62b2b | JcoolForever/voicemodels | [
"region:us"
] | 2024-01-12T03:38:40+00:00 | {} | 2024-01-17T03:00:16+00:00 |
|
c1a946ef609a9a7816e9ef85c7192ec1724a3064 | SumitMdhr/NEPALI-ASR | [
"region:us"
] | 2024-01-12T03:47:15+00:00 | {} | 2024-01-20T07:47:27+00:00 |
|
e1ef67a5632067619469002f7d8737e53d8cf3fb | umesh16071973/HRMS_preprocessed_data | [
"region:us"
] | 2024-01-12T04:38:41+00:00 | {} | 2024-01-18T09:23:04+00:00 |
|
4c7bd7f6b4be74653c87ba589d788b495743c22e | jerryjalapeno/jsonltest | [
"region:us"
] | 2024-01-12T05:03:20+00:00 | {} | 2024-01-12T05:05:31+00:00 |
|
fc729ddcd90915ba149d3c5b07aa513957fbfcdb | SrLamb/Lamb.PTBR.AI.Model | [
"region:us"
] | 2024-01-12T05:05:22+00:00 | {} | 2024-01-13T04:22:56+00:00 |
|
265f2df00ccc271aae064fd333b6aae2a03dac8d | Pokce/test | [
"region:us"
] | 2024-01-12T05:10:59+00:00 | {} | 2024-01-12T15:03:03+00:00 |
|
4319cf3df3ebc4e55a4cfb77cfe83ef99dbab0ae | ayrus08/TCGEL_AI_Dataset | [
"region:us"
] | 2024-01-12T05:50:58+00:00 | {} | 2024-01-12T05:52:04+00:00 |
|
f07d867855c9742c9091ab81a9f71b1453c925f8 | Akajuuh/testeee | [
"region:us"
] | 2024-01-12T06:03:06+00:00 | {} | 2024-01-12T06:17:59+00:00 |
|
6cbe0e5e533b1af5a6cad69faab7b974eb97dea8 | ayrus08/TEL_CHAT | [
"region:us"
] | 2024-01-12T06:15:15+00:00 | {} | 2024-01-12T06:15:15+00:00 |
|
2e1882479699d12be32a867e10a84b57252c5395 | ByunByun/Positive_data | [
"region:us"
] | 2024-01-12T06:35:29+00:00 | {} | 2024-01-12T06:36:06+00:00 |
|
7205464e0b4b42940d4c4bf006d6a603179201e6 | picas9dan/ontokin_2024-01-12_14.16.36 | [
"region:us"
] | 2024-01-12T06:39:51+00:00 | {} | 2024-01-12T08:02:38+00:00 |
|
e2da17d2155c8be8794d24982e8e88caf1b96917 | hduc-le/dvf | [
"region:us"
] | 2024-01-12T07:01:12+00:00 | {} | 2024-01-12T07:02:39+00:00 |
|
b40234f75c07739fa1b9c9edb727859810b7558e | tailer809/test2 | [
"region:us"
] | 2024-01-12T07:40:46+00:00 | {} | 2024-01-12T07:41:24+00:00 |
|
9875a7a1201bf2330ccb6aa369262676e2f26d4e | ByunByun/key_words_positive_title | [
"region:us"
] | 2024-01-12T07:43:56+00:00 | {} | 2024-01-12T07:45:46+00:00 |
|
c6f3838d6adbfd7d928e8660502efde75d50bb60 | picas9dan/chemistry_2024-01-12_16.22.36 | [
"region:us"
] | 2024-01-12T08:23:37+00:00 | {} | 2024-01-12T08:26:29+00:00 |
|
c0d070b7caf11df5ea51368c9a3f0da499dd193f | bunnikyu/yesnod | [
"region:us"
] | 2024-01-12T08:27:48+00:00 | {} | 2024-01-12T08:27:49+00:00 |
|
1ead300274854a8f363d9e6906cf42933e3c2070 | malaysia-ai/mosaic-litellama | [
"region:us"
] | 2024-01-12T08:36:42+00:00 | {} | 2024-01-12T12:54:14+00:00 |
|
ec85e95d966b09cae12c92ee19f2ade9b2042cfb | SoorajK1/sentiment_analysis_dataset-9846258f-16b8-422d-a291-965686769dca | [
"region:us"
] | 2024-01-12T09:02:23+00:00 | {} | 2024-01-12T09:02:29+00:00 |
|
37ef438f94bdf1f8c3e4b6e0eb6ed8fe81b3bcbf | Toannn/f | [
"region:us"
] | 2024-01-12T09:17:09+00:00 | {} | 2024-01-12T09:17:12+00:00 |