How to use inference on GPU

#13 opened by prajwalJumde

Inference is not working on the GPU, even though I have already installed CUDA:

python3 -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"
[PhysicalDevice(name='/physical_device:GPU:0', device_type='GPU')]

Hmm, it seems that for some reason you are checking GPU availability with TensorFlow. TensorFlow is not involved here at all; this is a model for llama.cpp. You have to use llama.cpp itself, or anything built on top of llama.cpp such as text-generation-webui or ctransformers, and it needs to be built with CUDA support so layers can be offloaded to the GPU.
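
For example, with llama-cpp-python, a minimal sketch might look like this. The model path, prompt, and layer count below are placeholders you would adjust for your own setup, and the package must have been installed with CUDA enabled (e.g. CMAKE_ARGS="-DGGML_CUDA=on" pip install llama-cpp-python, or -DLLAMA_CUBLAS=on on older versions):

from llama_cpp import Llama

# Placeholder path: point this at your own GGUF model file.
llm = Llama(
    model_path="./model.Q4_K_M.gguf",
    n_gpu_layers=-1,  # offload all layers to the GPU; lower this if you run out of VRAM
)

output = llm("Q: What is the capital of France? A:", max_tokens=32)
print(output["choices"][0]["text"])

If the layers were actually offloaded, llama.cpp prints this in its load log. ctransformers works similarly: its from_pretrained takes a gpu_layers argument that controls how many layers go to the GPU.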
