A newer version of this model is available: nomic-ai/nomic-embed-text-v2-moe

nomic-embed-text-v2-moe-unsupervised

nomic-embed-text-v2-moe-unsupervised is a multilingual mixture-of-experts (MoE) text embedding model. It is a checkpoint taken after the contrastive pretraining stage of the final model's multi-stage contrastive training.

If you want a model for extracting embeddings, we suggest using nomic-embed-text-v2-moe.

Join the Nomic Community

Model size: 475M params (F32, Safetensors)