Update README.md
README.md CHANGED
@@ -161,7 +161,7 @@ outputs = model.generate(
 
 response_list = [tokenizer.decode(output).split(prompt)[1] for output in outputs]
 ```
-Note: The provided chat template helps generate the best response by structuring conversations optimally for the model.
+Note: The provided chat template, which is the default chat template, helps generate the best response by structuring conversations optimally for the model.
 
 ## Limitations
 The model was trained on a dataset that includes content from the internet, which may contain toxic language, biases, and unsafe content. As a result, the model may:
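The `response_list` line in the hunk above strips the prompt from each decoded generation: `tokenizer.decode(output)` returns the prompt followed by the continuation, so splitting on the prompt and taking index 1 keeps only the new text. A minimal standalone sketch of that idiom, using plain strings in place of real `tokenizer.decode()` output (the prompt and outputs below are made up for illustration):

```python
# Sketch of the prompt-stripping idiom from the README's `response_list`
# line. Each decoded output begins with the original prompt, so
# splitting on the prompt and taking index 1 leaves only the newly
# generated continuation. Hypothetical strings stand in for actual
# tokenizer.decode() results.
prompt = "What is the capital of France?"
decoded_outputs = [
    "What is the capital of France? The capital of France is Paris.",
    "What is the capital of France? Paris, of course.",
]

response_list = [decoded.split(prompt)[1] for decoded in decoded_outputs]

for response in response_list:
    print(response.strip())
```

Note that this relies on the decoded text reproducing the prompt exactly; if decoding alters the prompt (e.g. via special tokens or whitespace changes), the split returns the whole string at index 0 and index 1 raises `IndexError`.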