Update README.md

README.md CHANGED

@@ -18,7 +18,7 @@ datasets:
 
 ## Release Documentation
 
-OLMoE-1B-7B-0125-DPO January 2025 is post-trained variant of the [OLMoE-1B-7B January 2025](https://huggingface.co/allenai/OLMoE-1B-7B-0125) model, which has undergone supervised finetuning on an OLMo-specific variant of the [Tülu 3 dataset](allenai/tulu-3-sft-olmo-2-mixture) and further DPO training on [this dataset](https://huggingface.co/datasets/allenai/
+OLMoE-1B-7B-0125-DPO January 2025 is a post-trained variant of the [OLMoE-1B-7B January 2025](https://huggingface.co/allenai/OLMoE-1B-7B-0125) model, which has undergone supervised finetuning on an OLMo-specific variant of the [Tülu 3 dataset](https://huggingface.co/datasets/allenai/tulu-3-sft-olmo-2-mixture) and further DPO training on [this dataset](https://huggingface.co/datasets/allenai/olmoe-0125-1b-7b-preference-mix).
 
 Tülu 3 is designed for state-of-the-art performance on a diversity of tasks in addition to chat, such as MATH, GSM8K, and IFEval.
 
 Check out the [OLMoE paper](https://arxiv.org/abs/2409.02060) or [Tülu 3 paper](https://arxiv.org/abs/2411.15124) for more details!

@@ -27,7 +27,7 @@ These models are trained on the Dolma dataset. We are releasing all code, checkp
 
 The core models released in this batch include the following:
 
-| **Stage** | **
+| **Stage**            | **OLMoE 1B-7B**                                                                                          |
 |----------------------|----------------------------------------------------------------------------------------------------------|
 | **Base Model**       | [allenai/OLMoE-1B-7B-0125](https://huggingface.co/allenai/OLMoE-1B-7B-0125)                              |
 | **SFT**              | [allenai/OLMoE-1B-7B-0125-SFT](https://huggingface.co/allenai/OLMoE-1B-7B-0125-SFT)                      |
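The table in this change lists the checkpoint for each post-training stage. As a usage sketch (not part of the card itself), the DPO checkpoint named in the diff could be loaded through the Hugging Face `transformers` API; the repo id below comes from the card, while the prompt text and generation settings are purely illustrative.

```python
# Hedged sketch: loading the DPO checkpoint described in this change.
# The repo id is taken from the model card; everything else is illustrative.

MODEL_ID = "allenai/OLMoE-1B-7B-0125-DPO"

def load_dpo_model(model_id: str = MODEL_ID):
    """Fetch tokenizer and weights from the Hugging Face Hub (large download)."""
    # Imported inside the function so the sketch is readable without
    # `transformers` installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load_dpo_model()
    # The post-trained variant is chat-tuned, so format the input with the
    # model's chat template rather than passing raw text.
    prompt = tokenizer.apply_chat_template(
        [{"role": "user", "content": "Explain DPO in one sentence."}],
        tokenize=False,
        add_generation_prompt=True,
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The SFT and base checkpoints in the table can be loaded the same way by swapping the repo id.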