Update README.md
README.md
CHANGED
@@ -57,6 +57,7 @@ I use max_seq_len 8K with alpha_value 2.65.
 - [3bpw](https://huggingface.co/altomek/CodeRosa-70B-AB1-3bpw-EXL2)
 - [2.4bpw](https://huggingface.co/altomek/CodeRosa-70B-AB1-2.4bpw-EXL2) --> 24GB VRAM
 - [measurements](https://huggingface.co/altomek/measurements/resolve/main/CodeRosa-AB1_measurement.json) --> ExLlamav2 measurements
+- [GGUF](https://huggingface.co/mradermacher/CodeRosa-70B-AB1-GGUF)
 
 ### PS
 I welcome your comments about this model.
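The hunk header above notes that the author runs this model with max_seq_len 8K and alpha_value 2.65. Below is a minimal sketch of loading one of the EXL2 quants with those settings via the ExLlamaV2 Python API; the local model path, prompt, sampler values, and token count are illustrative assumptions, not part of the README.

```python
# Minimal sketch (assumptions noted in comments): load an EXL2 quant of
# CodeRosa-70B-AB1 with max_seq_len 8192 and NTK RoPE alpha 2.65.
from exllamav2 import ExLlamaV2, ExLlamaV2Cache, ExLlamaV2Config, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "CodeRosa-70B-AB1-2.4bpw-EXL2"  # assumed local download path
config.prepare()
config.max_seq_len = 8192        # "max_seq_len 8K" from the hunk header
config.scale_alpha_value = 2.65  # NTK RoPE scaling ("alpha_value 2.65")

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)      # split layers across available GPU memory
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7       # illustrative sampler values
settings.top_p = 0.9

print(generator.generate_simple("def quicksort(arr):", settings, 200))
```

The GGUF link added in this commit targets llama.cpp-based loaders instead, where the equivalent settings would be passed as context-size and RoPE-alpha options of that runtime.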