Update README.md
README.md CHANGED
@@ -3,7 +3,7 @@ license: mit
 language:
 - en
 ---
-The smallest GPT-2 finetuned on approximately 2.23B tokens consisting of 1.3B from common crawl sites from 2023, 540M from ArXiv, and 390M from GitHub.
+The smallest GPT-2 finetuned on approximately 2.23B tokens (almost the 24.8B needed to 'chinchilla-optimally' pretrain it!) consisting of 1.3B from common crawl sites from 2023, 540M from ArXiv, and 390M from GitHub.

 *(from GPT-2 model card)*

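
For context on the added parenthetical: a "chinchilla-optimal" token budget comes from the scaling-law fit of Hoffmann et al. (2022), commonly summarized as roughly 20 training tokens per model parameter. Below is a minimal back-of-the-envelope sketch, assuming that ~20 tokens/parameter rule of thumb and the 124M-parameter count of the smallest GPT-2; the exact figure depends on which fitted constants one uses.

```python
# Back-of-the-envelope Chinchilla-style token budget (a sketch, not this
# commit's calculation). Assumptions: ~20 tokens per parameter, the rule of
# thumb commonly drawn from Hoffmann et al. (2022), and 124M parameters for
# the smallest GPT-2.

def chinchilla_token_budget(n_params: float, tokens_per_param: float = 20.0) -> float:
    """Estimate a compute-optimal pretraining token count for a model size."""
    return n_params * tokens_per_param

gpt2_small_params = 124e6   # parameter count of the smallest GPT-2
finetune_tokens = 2.23e9    # token count reported in this README

budget = chinchilla_token_budget(gpt2_small_params)
print(f"Chinchilla-style budget: {budget / 1e9:.2f}B tokens")
print(f"This finetune:           {finetune_tokens / 1e9:.2f}B tokens")
```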