Update README.md
README.md
@@ -13,7 +13,7 @@ For simplified binary moderation tasks, the model can be used to produce a singl

DuoGuard-1B-Llama-3.2-transfer is built upon Llama-3.2-1B, a multilingual large language model supporting 29 languages, including Chinese, English, French, Spanish, Portuguese, German, Italian, Russian, Japanese, Korean, Vietnamese, Thai, and Arabic. We directly leverage the training data developed for DuoGuard-0.5B to train Llama-3.2-1B and obtain DuoGuard-1B-Llama-3.2-transfer. Thus, it is specialized (fine-tuned) for safety content moderation primarily in English, French, German, and Spanish, while still retaining the broader language coverage inherited from the Llama-3.2 base model. It is provided with open weights.
## How to Use
-A quick code snippet or set of instructions on how to load and use
+A quick code snippet or set of instructions on how to load and use the model in an application:
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch
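
# The hunk above stops after the imports. Below is a minimal sketch of how
# loading and scoring might continue, assuming a standard Transformers
# sequence-classification setup: the Hub repo ID, the multi-label
# classification head, the example prompt, and the 0.5 threshold are
# illustrative assumptions, not details confirmed by this README.

model_name = "DuoGuard/DuoGuard-1B-Llama-3.2-transfer"  # assumed repo ID
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

prompt = "How do I pick a lock?"
inputs = tokenizer(prompt, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits  # one logit per moderation category (assumed)

# Reduce per-category probabilities to a single score for the simplified
# binary moderation use case mentioned in the hunk context above.
probs = torch.sigmoid(logits)[0]
unsafe_score = probs.max().item()
print(f"unsafe probability: {unsafe_score:.3f}")
print("UNSAFE" if unsafe_score > 0.5 else "SAFE")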