Inference with vLLM not working

#2 opened by llamameta

OSError: openbmb/MiniCPM-V-2_6-gguf does not appear to have a file named config.json. Checkout 'https://huggingface.co/openbmb/MiniCPM-V-2_6-gguf/tree/main' for available files.
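
This error comes from vLLM's default Hugging Face loader, which looks for a config.json in the repo root; a GGUF-only repo like openbmb/MiniCPM-V-2_6-gguf ships just .gguf weight files, so the lookup fails. A minimal sketch of one possible workaround, assuming vLLM's (experimental) GGUF support covers this architecture: download a single .gguf file and pass its local path as the model, taking the tokenizer from the original non-GGUF repo. The filename below is a hypothetical example, so check the repo's file list for a real one:

```python
from huggingface_hub import hf_hub_download
from vllm import LLM, SamplingParams

# Grab one quantized weight file from the GGUF repo; the filename is a
# hypothetical example -- pick an actual one from the repo's file list.
gguf_path = hf_hub_download(
    repo_id="openbmb/MiniCPM-V-2_6-gguf",
    filename="ggml-model-Q4_K_M.gguf",
)

# vLLM cannot read the GGUF repo root (no config.json), but it can take a
# local .gguf file as `model`; the tokenizer comes from the original
# non-GGUF repo, which does ship a config.json.
llm = LLM(model=gguf_path, tokenizer="openbmb/MiniCPM-V-2_6")

outputs = llm.generate(
    ["Describe GGUF in one sentence."],
    SamplingParams(max_tokens=64),
)
print(outputs[0].outputs[0].text)
```

If vLLM's GGUF path turns out not to support this vision model, pointing vLLM at the original safetensors repo openbmb/MiniCPM-V-2_6 (which does include a config.json) may be the more reliable route.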

llamameta changed discussion title from Inference with vLLM to Inference with vLLM not working