Update README.md
README.md CHANGED
@@ -16,7 +16,7 @@ This Mixture-of-Experts model is the combination of the following:
 
 It is created using the following `mergekit-moe` config:
 
-```
+```yaml
 base_model: qwen_chat
 gate_mode: hidden
 dtype: bfloat16
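For reference, a config like the one in this hunk is typically saved to a YAML file and passed to the `mergekit-moe` command together with an output directory. The sketch below assumes illustrative paths (`config.yml`, `./output-model`) that are not part of this commit:

```sh
# Sketch only: run the MoE merge described by the config above.
# config.yml and ./output-model are placeholder paths, not from this repo.
mergekit-moe config.yml ./output-model
```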