efederici committed · Commit ccb3ac7 · 1 Parent(s): cad1a82

Update README.md

Files changed (1): README.md +3 -0
README.md CHANGED
@@ -11,6 +11,9 @@ tags:
 
 IPT-125m is a decoder-style transformer pretrained from scratch on 4.36 billion tokens of Italian text from the [OSCAR-2301](https://huggingface.co/datasets/oscar-corpus/OSCAR-2301) dataset.
 
+If you like this project, consider supporting it with a cup of coffee! 🤖✨🌞
+[![Buy me a coffee](https://badgen.net/badge/icon/Buy%20Me%20A%20Coffee?icon=buymeacoffee&label)](https://bmc.link/edoardofederici)
+
 ## How to Use
 
 This model is best used with the Hugging Face `transformers` library for training and finetuning.
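A minimal sketch of loading the model with `transformers`, assuming the checkpoint is published on the Hub as `efederici/ipt-125m` (repo id assumed, not stated in this diff) and that, like other MPT-style decoders, it needs `trust_remote_code=True`:

```python
# Sketch only: repo id and trust_remote_code requirement are assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "efederici/ipt-125m"  # assumed Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Generate a short Italian continuation from a prompt.
prompt = "L'Italia è"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same `from_pretrained` objects can be passed to a `Trainer` or a custom training loop for finetuning.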