Update README.md (#2)
Update README.md (67438563f97825184d0fe894f6d291c0567bb108)
README.md CHANGED
@@ -23,7 +23,7 @@ quantized_by: MaziyarPanahi
 
 ## Model Description
 
-Calme-
+Calme-4x7B is a Mixture of Experts (MoE) model, integrating four state-of-the-art Calme-7B models. Essentially, Calme-4x7B is composed of four Calme-7B models that have been individually fine-tuned, featuring two experts per token. This configuration brings the total to over 24 billion parameters. Calme-4x7B models are distinguished by their ability to generate text with exceptional clarity, calmness, and coherence.
 
 ### How to Use
 
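The added paragraph describes a Mixtral-style MoE: four individually fine-tuned 7B experts with top-2 routing per token, totaling over 24 billion parameters. Below is a minimal sketch of loading and querying such a checkpoint with Hugging Face `transformers`; the repository id `MaziyarPanahi/Calme-4x7B` is assumed from the card's title and may differ from the actual Hub path.

```python
# Minimal sketch: loading a Mixtral-style MoE checkpoint with transformers.
# The repo id "MaziyarPanahi/Calme-4x7B" is an assumption based on the card
# title, not a confirmed Hub path.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MaziyarPanahi/Calme-4x7B"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # >24B parameters need roughly 48 GB in fp16
    device_map="auto",          # shard across available GPUs/CPU
)

prompt = "Explain what a Mixture of Experts model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that with top-2 routing, only two of the four expert feed-forward blocks run per token, so inference compute is closer to a ~13B dense model even though all parameters must fit in memory.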