Kquant03 committed on
Commit 49431d8 · 1 Parent(s): 4dbfe54

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -11,7 +11,7 @@ tags:
 # A model for ERP, engineered to bring you the most desirable experience.
 
 I finally figured out how to quantize FrankenMoE properly, so prepare for a flood of GGUF models from me. This one is scripted to be into whatever you're planning to do to it.
-
+Special thanks to [Cultrix](https://huggingface.co/CultriX) for the [base model](https://huggingface.co/CultriX/MistralTrix-v1).
 # "[What is a Mixture of Experts (MoE)?](https://huggingface.co/blog/moe)"
 ### (from the MistralAI papers...click the quoted question above to navigate to it directly.)
 