---
license: apache-2.0
tags:
- moe
- frankenmoe
- merge
- mergekit
- Himitsui/Kaiju-11B
- Sao10K/Fimbulvetr-11B-v2
- decapoda-research/Antares-11b-v2
- beberik/Nyxene-v3-11B
base_model:
- Himitsui/Kaiju-11B
- Sao10K/Fimbulvetr-11B-v2
- decapoda-research/Antares-11b-v2
- beberik/Nyxene-v3-11B
---

# Umbra-v3-MoE-4x11b

**Creator:** SteelSkull
**About Umbra-v3-MoE-4x11b:** A Mixture of Experts (MoE) model built with LazyMergekit, designed for general assistance with a particular strength in storytelling and RP/ERP. It combines four community 11B models, each chosen for its performance in a different area, so the merged model handles a diverse range of tasks.
**Source Models:**
- Himitsui/Kaiju-11B
- Sao10K/Fimbulvetr-11B-v2
- decapoda-research/Antares-11b-v2
- beberik/Nyxene-v3-11B
**Configuration Highlights:** Each expert carries a carefully curated set of positive and negative prompts that steer the MoE gate: positive prompts describe requests the expert should receive, negative prompts those it should not. The sets are tailored to each source model's strengths, preserving general assistant capability while keeping the model's prowess in creative storytelling. A sketch of such a configuration follows.
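The actual configuration file is not reproduced in this card. As a rough sketch, a mergekit-moe (LazyMergekit) config for this kind of merge takes the shape below; the `base_model` choice and every prompt string are illustrative assumptions, not the values actually used:

```yaml
# Hypothetical mergekit-moe config. Field names follow mergekit's MoE format;
# the base_model choice and all prompt strings are placeholders.
base_model: Sao10K/Fimbulvetr-11B-v2   # assumed; the real base model is not stated in this card
gate_mode: hidden                      # initialize gates from hidden-state representations of the prompts
dtype: bfloat16
experts:
  - source_model: Himitsui/Kaiju-11B
    positive_prompts: ["write a story about", "roleplay as"]   # placeholder prompt strings
    negative_prompts: ["solve this equation"]
  - source_model: Sao10K/Fimbulvetr-11B-v2
    positive_prompts: ["continue this scene", "describe the character"]
    negative_prompts: ["summarize this article"]
  - source_model: decapoda-research/Antares-11b-v2
    positive_prompts: ["explain how", "help me plan"]
    negative_prompts: ["write a love scene"]
  - source_model: beberik/Nyxene-v3-11B
    positive_prompts: ["answer this question", "give me advice on"]
    negative_prompts: ["write a poem about"]
```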
**Usage Instructions:** The model can be loaded with the Hugging Face `transformers` library for advanced text-generation tasks; a minimal Python snippet is sketched below.
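A minimal sketch, assuming the model is published on the Hugging Face Hub under the repo id `SteelSkull/Umbra-v3-MoE-4x11b` and that its tokenizer ships a chat template (both assumptions; check the actual repo):

```python
# Minimal text-generation sketch using Hugging Face transformers.
from transformers import AutoTokenizer, pipeline
import torch

model_id = "SteelSkull/Umbra-v3-MoE-4x11b"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
pipe = pipeline(
    "text-generation",
    model=model_id,
    tokenizer=tokenizer,
    torch_dtype=torch.float16,  # halve memory vs. float32; the 4x11b MoE is large
    device_map="auto",          # spread layers across available GPUs/CPU
)

# Assumes the tokenizer defines a chat template; otherwise pass a plain prompt string.
messages = [{"role": "user", "content": "Tell me a short story about a lighthouse keeper."}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

outputs = pipe(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_p=0.95)
print(outputs[0]["generated_text"])
```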