---
title: Mixtral 8x22B
emoji: π
colorFrom: yellow
colorTo: blue
sdk: static
pinned: false
license: mit
short_description: Mixtral-8x22B LLM is a pre-trained generative model.
---
Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference |