Fix: Disable LLAMA_CURL in Hugging Face space environment
#168
opened by Gapeleon
This PR addresses an issue with the recent change in llama.cpp that sets LLAMA_CURL=ON by default.
The default setting breaks the Hugging Face space environment for gguf-my-repo because libcurl dependencies are not available in the HF space runtime.
This fix explicitly sets LLAMA_CURL=OFF when running in the Hugging Face space environment while maintaining the ability to use CURL when running locally.
Changes:
- Added a LLAMA_CURL environment-variable control alongside the existing GGML_CUDA control (see the sketch below)
- Set LLAMA_CURL=OFF by default for safety
- Enable LLAMA_CURL only when running locally, where libcurl is available
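As a rough illustration of the kind of toggle described above (not the exact change in this PR), here is a minimal Python sketch. It assumes a hypothetical RUN_LOCALLY environment variable and the standard llama.cpp CMake options LLAMA_CURL and GGML_CUDA; the real start script, variable names, and build paths in the space may differ.

```python
import os
import subprocess

# Hypothetical toggle: decide build flags based on a RUN_LOCALLY-style
# environment variable, mirroring the existing GGML_CUDA control.
run_locally = os.environ.get("RUN_LOCALLY", "0") == "1"

cmake_args = [
    "cmake", "llama.cpp", "-B", "llama.cpp/build",
    # Keep LLAMA_CURL off by default; enable it only for local runs,
    # where libcurl development packages are assumed to be installed.
    f"-DLLAMA_CURL={'ON' if run_locally else 'OFF'}",
    # Build with CUDA only in the hosted space (A10G GPU); skip it locally.
    f"-DGGML_CUDA={'OFF' if run_locally else 'ON'}",
]

subprocess.run(cmake_args, check=True)
```

The key point is that the HF space runtime lacks libcurl, so the space path always configures llama.cpp with -DLLAMA_CURL=OFF, while local runs keep CURL support available.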
Tested by deploying a duplicate of the space with these changes applied; the build failure is resolved.