Minimum requirements

#18 opened by Julen10

What are the minimum hardware requirements for deploying this model on vLLM?

MiniMax org

8 x 90GB GPU

Thanks!!

Is there any site where I can find the deployment requirements for the different models?
For example, to check how much https://huggingface.co/MiniMaxAI/MiniMax-M1-40k needs.

MiniMax org

The MiniMax-M1 models all share the same architecture, so the minimum recommended configuration is 8 GPUs with 90GB of memory each. If you encounter GPU memory issues, you can try switching to int8 or applying further quantization to reduce memory usage.
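
For reference, a minimal vLLM sketch for the 8-GPU setup described above might look like this. The model ID, prompt, and sampling settings are illustrative; check the model card for the officially supported serving arguments and quantized variants.

```python
# Minimal sketch: serving MiniMax-M1-40k with vLLM, sharded across 8 GPUs.
# The model ID, prompt, and sampling settings here are placeholders.
from vllm import LLM, SamplingParams

llm = LLM(
    model="MiniMaxAI/MiniMax-M1-40k",  # all M1 variants share the same architecture
    tensor_parallel_size=8,            # split the weights across 8 GPUs
    trust_remote_code=True,            # allow the model's custom code, if required
    # If memory is tight, a quantized checkpoint with the matching
    # quantization= argument can be used instead (see the model card).
)

sampling = SamplingParams(temperature=0.7, max_tokens=256)
outputs = llm.generate(["What are the minimum hardware requirements?"], sampling)
print(outputs[0].outputs[0].text)
```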
