Update README.md

README.md CHANGED

@@ -165,4 +165,17 @@ For details, we refer to the paper and to our benchmark [ChocoLlama-Bench](https
 
 ### Compute Infrastructure
 
-All ChocoLlama models have been trained on the compute cluster provided by the [Flemish Supercomputer Center (VSC)](https://www.vscentrum.be/). We used 8 to 16 NVIDIA A100 GPUs with 80 GB of VRAM.
+All ChocoLlama models have been trained on the compute cluster provided by the [Flemish Supercomputer Center (VSC)](https://www.vscentrum.be/). We used 8 to 16 NVIDIA A100 GPUs with 80 GB of VRAM.
+
+## Citation
+
+If you found this useful for your work, kindly cite our paper:
+
+```
+@article{meeus2024chocollama,
+  title={ChocoLlama: Lessons Learned From Teaching Llamas Dutch},
+  author={Meeus, Matthieu and Rath{\'e}, Anthony and Remy, Fran{\c{c}}ois and Delobelle, Pieter and Decorte, Jens-Joris and Demeester, Thomas},
+  journal={arXiv preprint arXiv:2412.07633},
+  year={2024}
+}
+```