Is there any way to run this on 16-24 GB VRAM GPUs?

#46
by dominic1021 - opened

Hello!
I tried to run this app on Google Colab (Tesla T4, 16 GB VRAM) and got an out-of-VRAM error.
Is there any way to run this on GPUs with 16-24 GB VRAM? Even if it means using a lower resolution and so on, I'm fine with that.
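For what it's worth, here's a minimal sketch of the usual VRAM-saving knobs in PyTorch (fp16 weights, inference mode, freeing the allocator cache). The pipeline class, checkpoint name, and call signature are placeholders, not the actual Trellis/Hunyuan3D-2 API; only the torch calls themselves are standard:

```python
import torch

# Hypothetical import -- substitute the real Trellis or Hunyuan3D-2
# pipeline class; this name is an assumption for illustration only.
from some_3d_lib import ImageTo3DPipeline

# Loading weights in fp16 roughly halves VRAM use compared to fp32.
pipe = ImageTo3DPipeline.from_pretrained(
    "some/checkpoint",          # placeholder model id
    torch_dtype=torch.float16,  # standard torch dtype
)
pipe.to("cuda")

# inference_mode() disables autograd bookkeeping, so no gradient
# buffers are kept alive during generation.
with torch.inference_mode():
    result = pipe("input.png")  # hypothetical call signature

# Release cached allocator blocks between runs.
torch.cuda.empty_cache()
```

No idea if that alone is enough to fit a T4, though.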

So far, I found a site which gives A100s for free, but I don't know how to configure Trellis/Hunyuan3D-2 on it.
Maybe you have better luck: https://modal.com/
It gives $30 in free credits per month to use any GPU you want, but everything is done through terminal commands with the `modal` CLI, and I don't really know how to make Trellis work on it...
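In case it helps anyone trying the same route, here's a minimal sketch of how a GPU function looks on Modal (based on their documented App/function pattern; the pip packages and the function body are placeholders, not a working Trellis setup):

```python
import modal

app = modal.App("trellis-test")

# Build a container image with your dependencies; the package list
# here is a placeholder, not a verified Trellis requirements set.
image = modal.Image.debian_slim().pip_install("torch", "pillow")

@app.function(gpu="A100", image=image, timeout=600)
def generate():
    import torch
    # Placeholder body: load the model and run inference here.
    print(torch.cuda.get_device_name(0))

@app.local_entrypoint()
def main():
    generate.remote()
```

You'd then run it from the terminal with `modal run script.py` after `pip install modal` and `modal setup` to authenticate. Getting all of Trellis's dependencies into that image is the part I haven't figured out.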