Update README.md
README.md CHANGED
@@ -18,7 +18,7 @@ I was approached with the idea to make a merge based on story telling, and consi
We believe that this, while it might not be better logically than Mixtral base instruct, is definitely more creative. Special thanks to [NeuralNovel](https://huggingface.co/NeuralNovel) for collaborating with me on this project.

-
+

It performs better than base Mixtral 8x across many evaluations. It's half the size and is comparable to most MoEs. Thanks so much to HuggingFace for evaluating it!

# "[What is a Mixture of Experts (MoE)?](https://huggingface.co/blog/moe)"
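The heading above links to an explainer on Mixture-of-Experts models. As a rough illustration of the routing idea behind such models (not this model's actual code), here is a toy top-2 gating step; the expert count, hidden size, and tiny linear "experts" are made-up assumptions for the sketch.

```python
# Toy sketch of top-2 expert routing, the mechanism behind Mixture-of-Experts
# layers. All sizes and the tiny linear "experts" below are illustrative
# assumptions, not this model's actual implementation.
import torch
import torch.nn.functional as F

hidden_size, num_experts, top_k = 16, 8, 2
x = torch.randn(1, hidden_size)                      # one token's hidden state

router = torch.nn.Linear(hidden_size, num_experts)   # gating network scores each expert
experts = [torch.nn.Linear(hidden_size, hidden_size) for _ in range(num_experts)]

gate = F.softmax(router(x), dim=-1)
weights, chosen = torch.topk(gate, top_k)            # keep only the top-k experts
weights = weights / weights.sum(dim=-1, keepdim=True)

# Only the chosen experts run; their outputs are blended by the routing weights.
out = sum(w * experts[int(i)](x) for w, i in zip(weights[0], chosen[0]))
print(out.shape)  # torch.Size([1, 16])
```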