**Pre-Training**:

| Model | Training Data | Steps | Context Length | Tokens | LR | Batch Size | Precision |
|---|---|---|---|---|---|---|---|
| [Doge-20M](https://huggingface.co/JingzeShi/Doge-20M) | [HuggingFaceTB/smollm-corpus](https://huggingface.co/datasets/HuggingFaceTB/smollm-corpus) | 8k | 2048 | 4B | 8e-3 | 0.5M | bfloat16 |
| [Doge-60M](https://huggingface.co/JingzeShi/Doge-60M) | [HuggingFaceTB/smollm-corpus](https://huggingface.co/datasets/HuggingFaceTB/smollm-corpus) | 16k | 2048 | 16B | 6e-3 | 1M | bfloat16 |

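The Steps column follows from the token budget and batch size, assuming the Batch Size column is measured in tokens per optimization step (an interpretation, not stated explicitly in the table). A quick sanity check:

```python
# Sanity check: training steps = total tokens / tokens per step.
# Assumption: the "Batch Size" column is in tokens per optimization step.
runs = {
    "Doge-20M": {"tokens": 4e9, "batch_tokens": 0.5e6},   # 4B tokens, 0.5M batch
    "Doge-60M": {"tokens": 16e9, "batch_tokens": 1e6},    # 16B tokens, 1M batch
}
for name, r in runs.items():
    steps = r["tokens"] / r["batch_tokens"]
    print(f"{name}: {steps:,.0f} steps")
# → Doge-20M: 8,000 steps; Doge-60M: 16,000 steps — matching the 8k and 16k entries.
```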
**Evaluation**:

| Model | MMLU | TriviaQA | ARC-E | ARC-C | PIQA | HellaSwag | OBQA | Winogrande |
|---|---|---|---|---|---|---|---|---|
| [Doge-20M](https://huggingface.co/JingzeShi/Doge-20M) | 25.43 | 0 | 36.83 | 22.53 | 58.38 | 27.25 | 25.60 | 50.20 |
| [Doge-60M](https://huggingface.co/JingzeShi/Doge-60M) | 26.41 | 0 | 50.00 | 25.34 | 61.43 | 31.45 | 28.00 | 49.64 |

> All evaluations are done in a five-shot setting, without additional training on the benchmarks.

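"Five-shot" means each benchmark question is preceded by five solved examples in the prompt, with no gradient updates. A minimal sketch of how such a prompt is typically assembled (the example data and Q/A formatting here are hypothetical, not the exact evaluation harness used):

```python
def build_few_shot_prompt(examples, question, k=5):
    """Assemble a k-shot prompt: k solved examples followed by the test question.

    `examples` is a list of (question, answer) pairs; the formatting below is
    illustrative — real harnesses use benchmark-specific templates.
    """
    shots = [f"Q: {q}\nA: {a}" for q, a in examples[:k]]
    return "\n\n".join(shots + [f"Q: {question}\nA:"])

# Hypothetical demonstrations, just to show the prompt shape.
demos = [(f"example question {i}", f"example answer {i}") for i in range(5)]
print(build_few_shot_prompt(demos, "What is the capital of France?"))
```

The model's completion after the final `A:` is then scored against the reference answer, which is why no benchmark-specific training is needed.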
**Procedure**: