IPT-125m is a decoder-style transformer pretrained from scratch on 4.36 billion tokens of Italian text from the [OSCAR-2301](https://huggingface.co/datasets/oscar-corpus/OSCAR-2301) dataset.
If you like this project, consider supporting it with a cup of coffee! 🤖✨🌞
[Buy me a coffee](https://bmc.link/edoardofederici)
## How to Use
This model is best used with the Hugging Face `transformers` library for training, fine-tuning, and inference.
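As a minimal sketch of loading the model with `transformers`, assuming IPT-125m is published on the Hub (the repo id below is a placeholder — substitute the model's actual Hub id):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id: replace with the actual Hub id of IPT-125m.
model_id = "your-namespace/ipt-125m"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short Italian continuation from a prompt.
prompt = "L'Italia è"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since it is a standard decoder-only checkpoint, it also works with `Trainer` and the usual causal-language-modeling fine-tuning recipes.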