This model was created by reducing oshizo/japanese-e5-mistral-7b_slerp to 8 layers and then training it on 800,000 Japanese sentences.
See this article (in Japanese) for details:
https://note.com/oshizo/n/n9140df790315

See the intfloat/e5-mistral-7b-instruct page for model usage.
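The e5-mistral family embeds a text as the hidden state of its last non-padding token. As a minimal sketch of that pooling step (the model loading itself follows the intfloat/e5-mistral-7b-instruct card, e.g. `AutoModel.from_pretrained("oshizo/japanese-e5-mistral-1.9b")`; the toy tensors below stand in for real model outputs):

```python
import torch
from torch import Tensor

def last_token_pool(last_hidden_states: Tensor, attention_mask: Tensor) -> Tensor:
    # Embed each sequence as the hidden state of its last non-padding token.
    # Handle both left-padded and right-padded batches.
    left_padded = attention_mask[:, -1].sum() == attention_mask.shape[0]
    if left_padded:
        return last_hidden_states[:, -1]
    seq_lens = attention_mask.sum(dim=1) - 1  # index of last real token per row
    batch = torch.arange(last_hidden_states.shape[0])
    return last_hidden_states[batch, seq_lens]

# Toy demo: batch of 2, sequence length 3, hidden size 4;
# the second row is right-padded by one token.
hidden = torch.arange(24, dtype=torch.float32).reshape(2, 3, 4)
mask = torch.tensor([[1, 1, 1], [1, 1, 0]])
pooled = last_token_pool(hidden, mask)
print(pooled.shape)  # torch.Size([2, 4])
```

For retrieval, queries are typically prefixed with a task instruction before encoding, and embeddings are L2-normalized before computing cosine similarity, as in the upstream model card.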

Model size: 1.88B params (F32, Safetensors)
