QuantFactory/codegeex4-all-9b-GGUF
Task: Text Generation · Format: GGUF · Languages: Chinese, English
Tags: glm, codegeex, thudm
License: codegeex4
2 contributors · History: 19 commits
Latest commit: 1464a0f (verified) by aashish1904 · "Upload codegeex4-all-9b.Q3_K_S.gguf with huggingface_hub" · about 1 year ago
All files below were uploaded with huggingface_hub, about 1 year ago.

File                            Size
.gitattributes                  2.35 kB
codegeex4-all-9b.Q2_K.gguf      3.99 GB
codegeex4-all-9b.Q3_K_L.gguf    5.28 GB
codegeex4-all-9b.Q3_K_M.gguf    5.06 GB
codegeex4-all-9b.Q3_K_S.gguf    4.59 GB
codegeex4-all-9b.Q4_0.gguf      5.46 GB
codegeex4-all-9b.Q4_1.gguf      6 GB
codegeex4-all-9b.Q4_K_M.gguf    6.25 GB
codegeex4-all-9b.Q5_0.gguf      6.55 GB
codegeex4-all-9b.Q5_1.gguf      7.1 GB
codegeex4-all-9b.Q5_K_M.gguf    7.14 GB
codegeex4-all-9b.Q5_K_S.gguf    6.69 GB
codegeex4-all-9b.Q6_K.gguf      8.26 GB
codegeex4-all-9b.Q8_0.gguf      9.99 GB
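The quantization suffixes (Q2_K through Q8_0) indicate roughly how many bits each variant stores per weight, which is why the file sizes grow from about 4 GB to about 10 GB. As a rough sanity check, file size divided by parameter count gives the effective bits per weight. The sketch below assumes roughly 9.4 billion total parameters for codegeex4-all-9b; that figure is an assumption (the "9b" in the name plus embedding overhead), not a number stated on this page.

```python
# Rough bits-per-weight estimate for a few of the GGUF files listed above.
# ASSUMPTION: ~9.4e9 total parameters; the exact count is not given here.

PARAMS = 9.4e9

# File sizes from the listing, in GB (treated as 1e9 bytes).
sizes_gb = {
    "Q2_K": 3.99,
    "Q4_K_M": 6.25,
    "Q8_0": 9.99,
}

def bits_per_weight(size_gb: float, params: float = PARAMS) -> float:
    """Effective bits stored per parameter: total bits / parameter count."""
    return size_gb * 1e9 * 8 / params

for name, gb in sizes_gb.items():
    print(f"{name}: ~{bits_per_weight(gb):.1f} bits/weight")
```

The estimate lands near the nominal bit width of each scheme (Q2_K around 3.4, Q4_K_M around 5.3, Q8_0 around 8.5), slightly above it because GGUF files also carry scales, metadata, and some tensors kept at higher precision. The same arithmetic is a quick way to check whether a given file will fit in your available RAM or VRAM.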