SpongeEngine/Rombo-LLM-V3.0-Qwen-72b-i1-GGUF
Tags: GGUF · English · SpongeQuant · i1-GGUF · Inference Endpoints · imatrix · conversational
License: mit
Files and versions
Branch: main · 1 contributor · History: 48 commits
Latest commit: dclipca · Upload folder using huggingface_hub · 991283c (verified) · 1 day ago
.gitattributes · Safe · 3.3 kB · Upload folder using huggingface_hub · 1 day ago
README.md · Safe · 2.56 kB · Upload folder using huggingface_hub · 1 day ago
Rombo-LLM-V3.0-Qwen-72b.imatrix.dat · 25.2 MB · LFS · Upload folder using huggingface_hub · 3 days ago
rombo-llm-v3.0-qwen-72b-i1-IQ1_M.gguf · Safe · 23.7 GB · LFS · Upload folder using huggingface_hub · 3 days ago
rombo-llm-v3.0-qwen-72b-i1-IQ2_M.gguf · Safe · 29.3 GB · LFS · Upload folder using huggingface_hub · 3 days ago
rombo-llm-v3.0-qwen-72b-i1-IQ2_S.gguf · Safe · 27.9 GB · LFS · Upload folder using huggingface_hub · 3 days ago
rombo-llm-v3.0-qwen-72b-i1-IQ2_XS.gguf · Safe · 27.1 GB · LFS · Upload folder using huggingface_hub · 3 days ago
rombo-llm-v3.0-qwen-72b-i1-IQ2_XXS.gguf · Safe · 25.5 GB · LFS · Upload folder using huggingface_hub · 3 days ago
rombo-llm-v3.0-qwen-72b-i1-IQ3_M.gguf · Safe · 35.5 GB · LFS · Upload folder using huggingface_hub · 1 day ago
rombo-llm-v3.0-qwen-72b-i1-IQ3_S.gguf · Safe · 34.5 GB · LFS · Upload folder using huggingface_hub · 2 days ago
rombo-llm-v3.0-qwen-72b-i1-IQ3_XS.gguf · Safe · 32.8 GB · LFS · Upload folder using huggingface_hub · 3 days ago
rombo-llm-v3.0-qwen-72b-i1-IQ3_XXS.gguf · Safe · 31.8 GB · LFS · Upload folder using huggingface_hub · 3 days ago
rombo-llm-v3.0-qwen-72b-i1-IQ4_NL.gguf · Safe · 41.3 GB · LFS · Upload folder using huggingface_hub · 1 day ago
rombo-llm-v3.0-qwen-72b-i1-IQ4_XS.gguf · Safe · 39.7 GB · LFS · Upload folder using huggingface_hub · 1 day ago
rombo-llm-v3.0-qwen-72b-i1-Q2_K.gguf · Safe · 29.8 GB · LFS · Upload folder using huggingface_hub · 1 day ago
rombo-llm-v3.0-qwen-72b-i1-Q2_K_S.gguf · Safe · 29.6 GB · LFS · Upload folder using huggingface_hub · 1 day ago
rombo-llm-v3.0-qwen-72b-i1-Q3_K_L.gguf · Safe · 39.5 GB · LFS · Upload folder using huggingface_hub · 1 day ago
rombo-llm-v3.0-qwen-72b-i1-Q3_K_M.gguf · Safe · 37.7 GB · LFS · Upload folder using huggingface_hub · 1 day ago
rombo-llm-v3.0-qwen-72b-i1-Q3_K_S.gguf · Safe · 34.5 GB · LFS · Upload folder using huggingface_hub · 1 day ago
rombo-llm-v3.0-qwen-72b-i1-Q4_0.gguf · Safe · 41.4 GB · LFS · Upload folder using huggingface_hub · 1 day ago
rombo-llm-v3.0-qwen-72b-i1-Q4_1.gguf · Safe · 45.7 GB · LFS · Upload folder using huggingface_hub · 1 day ago
rombo-llm-v3.0-qwen-72b-i1-Q4_K_M.gguf · Safe · 47.4 GB · LFS · Upload folder using huggingface_hub · 1 day ago
rombo-llm-v3.0-qwen-72b-i1-Q4_K_S.gguf · Safe · 43.9 GB · LFS · Upload folder using huggingface_hub · 1 day ago
rombo-llm-v3.0-qwen-72b-i1-TQ1_0.gguf · 23.4 GB · LFS · Upload folder using huggingface_hub · 3 days ago
rombo-llm-v3.0-qwen-72b-i1-TQ2_0.gguf · 25.7 GB · LFS · Upload folder using huggingface_hub · 3 days ago
upload_success.txt · Safe · 18 Bytes · Upload folder using huggingface_hub · 3 days ago
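
Since every weight file above is a standalone GGUF quantization, a single file can be fetched on its own rather than cloning the whole repository. A minimal sketch using the huggingface_hub library named in the commit messages; the Q4_K_M quant is only an example, and any other filename from the listing works the same way:

```python
# Minimal sketch: download one GGUF quant from this repo with huggingface_hub.
# The choice of Q4_K_M is illustrative; substitute whichever quant fits your hardware.
from huggingface_hub import hf_hub_download

gguf_path = hf_hub_download(
    repo_id="SpongeEngine/Rombo-LLM-V3.0-Qwen-72b-i1-GGUF",
    filename="rombo-llm-v3.0-qwen-72b-i1-Q4_K_M.gguf",
)
print(gguf_path)  # local cache path of the downloaded GGUF file
```

The returned path points into the local Hugging Face cache; a GGUF-aware runtime such as llama.cpp can then load the file directly from that location.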