MaziyarPanahi committed
Commit a227995 · verified · 1 Parent(s): 54c445d

Update README.md (#2)


- Update README.md (67438563f97825184d0fe894f6d291c0567bb108)

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -23,7 +23,7 @@ quantized_by: MaziyarPanahi

  ## Model Description

- Calme-7B is a state-of-the-art language model with 24 billion parameters, fine-tuned over high-quality datasets on top of Mistral-7B. The Calme-4x7B models excel in generating text that resonates with clarity, calmness, and coherence.
+ Calme-4x7B is a Mixture of Experts (MoE) model, integrating four state-of-the-art Calme-7B models. Essentially, Calme-4x7B is composed of four Calme-7B models that have been individually fine-tuned, featuring two experts per token. This configuration brings the total to over 24 billion parameters. Calme-4x7B models are distinguished by their ability to generate text with exceptional clarity, calmness, and coherence.

  ### How to Use
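
For context on the architecture the updated paragraph describes (four Calme-7B experts with two routed per token), here is a minimal sketch of loading a Mixtral-style MoE checkpoint like this one with Hugging Face transformers. It is not part of the commit; the repository id and generation settings are illustrative assumptions.

```python
# Minimal sketch (not from this commit): loading a Mixtral-style MoE
# checkpoint such as Calme-4x7B with Hugging Face transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MaziyarPanahi/Calme-4x7B-MoE-v0.1"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # shard the ~24B parameters across available devices
    torch_dtype="auto",  # load in the checkpoint's native precision
)

prompt = "Explain what a Mixture of Experts model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the router activates only two of the four experts per token, each forward pass touches roughly half of the total parameters, which is the usual reason an MoE of this size can decode faster than a dense model of comparable parameter count.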