---
license: apache-2.0
tags:
- moe
- frankenmoe
- merge
- mergekit
- lazymergekit
- openaccess-ai-collective/tiny-mistral
base_model:
- openaccess-ai-collective/tiny-mistral
- openaccess-ai-collective/tiny-mistral
- openaccess-ai-collective/tiny-mistral
- openaccess-ai-collective/tiny-mistral
---
# test_tiny_mixtral_only_router
test_tiny_mixtral_only_router is a Mixture of Experts (MoE) built from the following models using a modified version of mergekit:
* [openaccess-ai-collective/tiny-mistral](https://huggingface.co/openaccess-ai-collective/tiny-mistral)
* [openaccess-ai-collective/tiny-mistral](https://huggingface.co/openaccess-ai-collective/tiny-mistral)
* [openaccess-ai-collective/tiny-mistral](https://huggingface.co/openaccess-ai-collective/tiny-mistral)
* [openaccess-ai-collective/tiny-mistral](https://huggingface.co/openaccess-ai-collective/tiny-mistral)
## 🧩 Configuration
```yaml
base_model: openaccess-ai-collective/tiny-mistral
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: openaccess-ai-collective/tiny-mistral
    positive_prompts:
      - "math"
    # You can add negative_prompts if needed
  - source_model: openaccess-ai-collective/tiny-mistral
    positive_prompts:
      - "science"
  - source_model: openaccess-ai-collective/tiny-mistral
    positive_prompts:
      - "writing"
    # You can add negative_prompts if needed
  - source_model: openaccess-ai-collective/tiny-mistral
    positive_prompts:
      - "general"
```
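Since `gate_mode: hidden` initializes each layer's router from hidden-state representations of the positive prompts, one way to sanity-check the merged artifact is to inspect the gate weights. Below is a minimal sketch, assuming the checkpoint loads as a standard Mixtral-style model; the repo id is a placeholder:

```python
from transformers import AutoModelForCausalLM

# Placeholder repo id; substitute the full Hub path of this checkpoint.
model = AutoModelForCausalLM.from_pretrained("test_tiny_mixtral_only_router")

# In a Mixtral-style model, each decoder layer carries a linear router
# (block_sparse_moe.gate) that maps hidden states to per-expert logits.
for name, param in model.named_parameters():
    if name.endswith("block_sparse_moe.gate.weight"):
        print(name, tuple(param.shape))  # expect (num_experts, hidden_size)
```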
This is a test version of arcee-ai's hidden-state model: it is the router for a frankenMoE rather than the entire MoE itself.
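## 💻 Usage
A minimal generation sketch using 🤗 Transformers; the repo id is a placeholder for the full Hub path, and `bfloat16` matches the `dtype` in the merge config:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

repo_id = "test_tiny_mixtral_only_router"  # placeholder; use the full Hub id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.bfloat16)

inputs = tokenizer("Mixture-of-Experts models work by", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that this is a tiny test artifact, so the generations are not expected to be coherent.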