dranger003 / c4ai-command-r-plus-iMat.GGUF
Likes: 140
Tags: Text Generation · GGUF · imatrix · conversational
License: cc-by-nc-4.0
Files and versions
2 contributors
History: 170 commits
Latest commit: a234053 (verified) by dranger003, "Upload folder using huggingface_hub", about 1 year ago
File | Safe | Size | LFS | Last commit | Updated
.gitattributes |  | 6.8 kB |  | Upload folder using huggingface_hub | about 1 year ago
README.md | Safe | 5.3 kB |  | Update README.md | over 1 year ago
ggml-c4ai-command-r-plus-104b-ppl.png | Safe | 434 kB |  | Upload ggml-c4ai-command-r-plus-104b-ppl.png | over 1 year ago
ggml-c4ai-command-r-plus-f16-00001-of-00005.gguf | Safe | 49.5 GB | LFS | Upload folder using huggingface_hub | over 1 year ago
ggml-c4ai-command-r-plus-f16-00002-of-00005.gguf | Safe | 49.7 GB | LFS | Upload folder using huggingface_hub | over 1 year ago
ggml-c4ai-command-r-plus-f16-00003-of-00005.gguf | Safe | 49.5 GB | LFS | Upload folder using huggingface_hub | over 1 year ago
ggml-c4ai-command-r-plus-f16-00004-of-00005.gguf | Safe | 49.5 GB | LFS | Upload folder using huggingface_hub | over 1 year ago
ggml-c4ai-command-r-plus-f16-00005-of-00005.gguf | Safe | 9.44 GB | LFS | Upload folder using huggingface_hub | over 1 year ago
ggml-c4ai-command-r-plus-f16-imatrix.dat |  | 27.5 MB | LFS | Upload folder using huggingface_hub | over 1 year ago
ggml-c4ai-command-r-plus-iq4_nl-00001-of-00002.gguf | Safe | 49.8 GB | LFS | Upload folder using huggingface_hub | about 1 year ago
ggml-c4ai-command-r-plus-iq4_nl-00002-of-00002.gguf | Safe | 9.52 GB | LFS | Upload folder using huggingface_hub | about 1 year ago
ggml-c4ai-command-r-plus-iq4_xs-00001-of-00002.gguf | Safe | 49.7 GB | LFS | Upload folder using huggingface_hub | about 1 year ago
ggml-c4ai-command-r-plus-iq4_xs-00002-of-00002.gguf | Safe | 6.48 GB | LFS | Upload folder using huggingface_hub | about 1 year ago
ggml-c4ai-command-r-plus-q4_k_m-00001-of-00002.gguf | Safe | 49.7 GB | LFS | Upload folder using huggingface_hub | about 1 year ago
ggml-c4ai-command-r-plus-q4_k_m-00002-of-00002.gguf | Safe | 13 GB | LFS | Upload folder using huggingface_hub | about 1 year ago
ggml-c4ai-command-r-plus-q4_k_s-00001-of-00002.gguf | Safe | 49.7 GB | LFS | Upload folder using huggingface_hub | about 1 year ago
ggml-c4ai-command-r-plus-q4_k_s-00002-of-00002.gguf | Safe | 9.97 GB | LFS | Upload folder using huggingface_hub | about 1 year ago
ggml-c4ai-command-r-plus-q5_k_m-00001-of-00002.gguf | Safe | 49.7 GB | LFS | Upload folder using huggingface_hub | about 1 year ago
ggml-c4ai-command-r-plus-q5_k_m-00002-of-00002.gguf | Safe | 24 GB | LFS | Upload folder using huggingface_hub | about 1 year ago
ggml-c4ai-command-r-plus-q5_k_s-00001-of-00002.gguf | Safe | 49.6 GB | LFS | Upload folder using huggingface_hub | about 1 year ago
ggml-c4ai-command-r-plus-q5_k_s-00002-of-00002.gguf | Safe | 22.2 GB | LFS | Upload folder using huggingface_hub | about 1 year ago
ggml-c4ai-command-r-plus-q6_k-00001-of-00002.gguf | Safe | 49.7 GB | LFS | Upload folder using huggingface_hub | about 1 year ago
ggml-c4ai-command-r-plus-q6_k-00002-of-00002.gguf | Safe | 35.4 GB | LFS | Upload folder using huggingface_hub | about 1 year ago
ggml-c4ai-command-r-plus-q8_0-00001-of-00003.gguf | Safe | 49.8 GB | LFS | Upload folder using huggingface_hub | about 1 year ago
ggml-c4ai-command-r-plus-q8_0-00002-of-00003.gguf | Safe | 49.7 GB | LFS | Upload folder using huggingface_hub | about 1 year ago
ggml-c4ai-command-r-plus-q8_0-00003-of-00003.gguf | Safe | 10.8 GB | LFS | Upload folder using huggingface_hub | about 1 year ago
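
Each quantization is split across multiple GGUF shards (for example, -00001-of-00002 and -00002-of-00002), so fetching only the variant you need avoids downloading the full repository. Below is a minimal sketch for pulling just one variant's shards with the huggingface_hub library; the choice of the iq4_xs variant and the glob pattern are illustrative assumptions, not instructions from the model card.

```python
# Sketch: download only the iq4_xs shards from this repo.
# Assumes `pip install huggingface_hub`; the allow_patterns glob below is illustrative.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="dranger003/c4ai-command-r-plus-iMat.GGUF",
    allow_patterns=["*iq4_xs*.gguf"],  # match both shards of the chosen quantization
)
print("Shards downloaded to:", local_dir)
```

After the download, the shards sit together in the returned directory; tools that understand split GGUF files can then be pointed at the first shard of the set.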