feat: Add sponsorship and website section
README.md (changed)
@@ -53,6 +53,16 @@ data following a similar collection procedure in Hu et al. (2021a).
 
 This checkpoint is "GIT-large", which is a smaller variant of GIT trained on 20 million image-text pairs.
 
+---
+### 🌐 Website
+You can find more of my models, projects, and information on my official website:
+- **[artificialguy.com](https://artificialguy.com/)**
+
+### 💖 Support My Work
+If you find this model useful, please consider supporting my work. It helps me cover server costs and dedicate more time to new open-source projects.
+- **Patreon:** [Support on Patreon](https://www.patreon.com/user?u=81570187)
+- **Ko-fi:** [Buy me a Ko-fi](https://ko-fi.com/artificialguybr)
+- **Buy Me a Coffee:** [Buy me a Coffee](https://buymeacoffee.com/jvkape)
 Next, the model was fine-tuned on TextCaps.
 
 See table 11 in the [paper](https://arxiv.org/abs/2205.14100) for more details.