Tell me how you feel about this model without telling me how you feel about this model

#5
by MrDevolver - opened

I'll start...

[image]

111B is quite a good size, actually. It is possible to run a good quant with a decent context length on just four 24GB consumer-grade GPUs. And it is even a bit smaller than Mistral Large 123B and Pixtral Large 124B, which I use daily.
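
A rough back-of-the-envelope sketch of that claim (the bits-per-weight values and the 10% runtime-overhead factor are assumptions, and KV cache for long context is not counted):

```python
# Approximate weight-memory footprint of a ~111B-parameter model at a few
# common quantization widths, compared against 4 x 24 GB of VRAM.
# Illustrative numbers only, not measured figures.

def model_vram_gb(n_params_b: float, bits_per_weight: float, overhead: float = 1.10) -> float:
    """Weight memory in GB with a small overhead factor for runtime buffers;
    KV cache for long context is extra."""
    bytes_total = n_params_b * 1e9 * bits_per_weight / 8
    return bytes_total * overhead / 1e9

for bpw in (4.0, 4.5, 5.0):
    need = model_vram_gb(111, bpw)
    print(f"~{bpw} bits/weight -> ~{need:.0f} GB (vs. 4 x 24 GB = 96 GB)")
```

At roughly 61-76 GB for the weights, that would leave on the order of 20-35 GB across the four cards for KV cache and activations, which is where the "decent context length" part comes from.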

Haha, yeah, let me just grab the spare four 24GB consumer-grade GPUs from my garage; they would just collect dust otherwise... 🤣
