cortexso/gemma3
Text Generation · GGUF · cortex.cpp · featured · conversational
Files and versions
2 contributors · History: 5 commits
Latest commit 26a9631 (verified) by jan-hq: Upload folder using huggingface_hub, about 1 month ago
| File | Size | LFS | Last commit | Updated |
|---|---|---|---|---|
| .gitattributes | 2.76 kB | no | Upload folder using huggingface_hub | about 1 month ago |
| gemma-3-12b-it-q2_k.gguf | 4.77 GB | yes | Upload folder using huggingface_hub | about 1 month ago |
| gemma-3-12b-it-q3_k_l.gguf | 6.48 GB | yes | Upload folder using huggingface_hub | about 1 month ago |
| gemma-3-12b-it-q3_k_m.gguf | 6.01 GB | yes | Upload folder using huggingface_hub | about 1 month ago |
| gemma-3-12b-it-q3_k_s.gguf | 5.46 GB | yes | Upload folder using huggingface_hub | about 1 month ago |
| gemma-3-12b-it-q4_k_m.gguf | 7.3 GB | yes | Upload folder using huggingface_hub | about 1 month ago |
| gemma-3-12b-it-q4_k_s.gguf | 6.94 GB | yes | Upload folder using huggingface_hub | about 1 month ago |
| gemma-3-12b-it-q5_k_m.gguf | 8.44 GB | yes | Upload folder using huggingface_hub | about 1 month ago |
| gemma-3-12b-it-q5_k_s.gguf | 8.23 GB | yes | Upload folder using huggingface_hub | about 1 month ago |
| gemma-3-12b-it-q6_k.gguf | 9.66 GB | yes | Upload folder using huggingface_hub | about 1 month ago |
| gemma-3-12b-it-q8_0.gguf | 12.5 GB | yes | Upload folder using huggingface_hub | about 1 month ago |
| gemma-3-4b-it-q2_k.gguf | 1.73 GB | yes | Upload folder using huggingface_hub | about 1 month ago |
| gemma-3-4b-it-q3_k_l.gguf | 2.24 GB | yes | Upload folder using huggingface_hub | about 1 month ago |
| gemma-3-4b-it-q3_k_m.gguf | 2.1 GB | yes | Upload folder using huggingface_hub | about 1 month ago |
| gemma-3-4b-it-q3_k_s.gguf | 1.94 GB | yes | Upload folder using huggingface_hub | about 1 month ago |
| gemma-3-4b-it-q4_k_m.gguf | 2.49 GB | yes | Upload folder using huggingface_hub | about 1 month ago |
| gemma-3-4b-it-q4_k_s.gguf | 2.38 GB | yes | Upload folder using huggingface_hub | about 1 month ago |
| gemma-3-4b-it-q5_k_m.gguf | 2.83 GB | yes | Upload folder using huggingface_hub | about 1 month ago |
| gemma-3-4b-it-q5_k_s.gguf | 2.76 GB | yes | Upload folder using huggingface_hub | about 1 month ago |
| gemma-3-4b-it-q6_k.gguf | 3.19 GB | yes | Upload folder using huggingface_hub | about 1 month ago |
| gemma-3-4b-it-q8_0.gguf | 4.13 GB | yes | Upload folder using huggingface_hub | about 1 month ago |
| metadata.yml | 47 Bytes | no | Upload metadata.yml with huggingface_hub | about 1 month ago |
| model.yml | 853 Bytes | no | Upload model.yml with huggingface_hub | about 1 month ago |
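Any single file in the table can be fetched on its own. A minimal sketch using the huggingface_hub Python client (the same library referenced in the commit messages); the chosen filename is just one of the quantizations listed above, and the printed path depends on your local cache location:

```python
# Sketch: download one GGUF quantization from the cortexso/gemma3 repo.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="cortexso/gemma3",
    filename="gemma-3-4b-it-q4_k_m.gguf",  # any filename from the table works
)
print(local_path)  # path to the cached GGUF file on disk
```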