nomic-embed-text-v2-moe-unsupervised
is a multilingual Mixture-of-Experts (MoE) text embedding model. It is a checkpoint taken after the contrastive pretraining stage of the multi-stage contrastive training used to produce the final model.
If you want a model for extracting embeddings, we suggest using nomic-embed-text-v2-moe instead.