Update README.md

README.md
@@ -21,7 +21,7 @@ Merging seems like the way to go when it comes to training language models on a
This model was created by training LoRAs and merging them with Della. Doing it this way saves space and time, and the result is good. The focus here is on the WhiteRabbitNeo datasets, along with coding.
- Incidentally, it seems uncensored.
+ Incidentally, it seems uncensored. It was trained using the ChatML template and can be used with or without a system prompt.
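For reference, a ChatML-formatted prompt looks like this (the system line is optional, and the prompt text here is just a placeholder):

```
<|im_start|>system
You are a helpful assistant.<|im_end|>
<|im_start|>user
Write a function that reverses a string.<|im_end|>
<|im_start|>assistant
```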
## Apache-2.0 + WhiteRabbitNeo Extended Version
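A Della merge of LoRA fine-tunes like the one described above can be expressed as a mergekit configuration. The sketch below is hypothetical: the model names, densities, and weights are placeholders, not the values used for this model.

```yaml
# Hypothetical mergekit config for a Della merge of two LoRA fine-tunes.
# Model names and parameter values below are placeholders.
models:
  - model: base-model            # the shared base model
  - model: whiterabbitneo-lora   # fine-tune on the WhiteRabbitNeo datasets
    parameters:
      density: 0.6               # fraction of delta parameters kept
      weight: 0.5                # contribution to the merged model
  - model: coding-lora           # fine-tune on coding data
    parameters:
      density: 0.6
      weight: 0.5
merge_method: della
base_model: base-model
dtype: bfloat16
```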