💨 QVikhr-2.5-1.5B-Instruct-r

An instruction-following model based on QVikhr-2.5-1.5B-Instruct-r, trained on the Russian-language ru Math dataset.

Description:

Vikhrmodels/QVikhr-2.5-1.5B-Instruct-r is a language model that underwent specialized training using the RuMath method.


Authors

@inproceedings{nikolich2024vikhr,
  title={Vikhr: Advancing Open-Source Bilingual Instruction-Following Large Language Models for Russian and English},
  author={Aleksandr Nikolich and Konstantin Korolev and Sergei Bratchikov and Nikolay Kompanets and Igor Kiselev and Artem Shelmanov},
  booktitle={Proceedings of the 4th Workshop on Multilingual Representation Learning (MRL) @ EMNLP-2024},
  year={2024},
  publisher={Association for Computational Linguistics},
  url={https://arxiv.org/pdf/2405.13929}
}
Format: GGUF
Model size: 1.54B params
Architecture: qwen2
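Since the architecture is qwen2, the chat template is presumably the ChatML-style format used by the Qwen2 family (`<|im_start|>`/`<|im_end|>` delimiters). A minimal sketch of how a prompt might be assembled; the exact system prompt and special tokens are assumptions and should be verified against the model's own tokenizer configuration:

```python
# ChatML-style prompt builder (a sketch; Qwen2-family models typically use
# <|im_start|>/<|im_end|> delimiters, but verify against the model's actual
# chat template before relying on this).
def build_chatml_prompt(messages):
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n")
    # Leave the assistant turn open so generation continues from here.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "Ты — полезный ассистент."},
    {"role": "user", "content": "Сколько будет 12 * 7?"},
])
print(prompt)
```

In practice, prefer the tokenizer's built-in `apply_chat_template` (or the template shipped inside the GGUF file) over hand-rolled formatting.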

Available quantizations: 1-bit, 2-bit, 3-bit, 4-bit, 5-bit, 6-bit, 8-bit, 16-bit
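As a rough guide, the on-disk size of each quantization can be estimated from the parameter count (1.54B) times the bits per weight; real GGUF files are somewhat larger because of per-block scales, non-quantized tensors, and metadata. An illustrative calculation under those simplifying assumptions:

```python
# Rough GGUF size estimate: params * bits / 8 bytes. This ignores
# quantization block overhead and mixed-precision tensors, so it is a
# lower-bound ballpark, not the exact file size.
PARAMS = 1.54e9  # 1.54B parameters

def approx_size_gb(bits_per_weight):
    return PARAMS * bits_per_weight / 8 / 1e9  # decimal gigabytes

for bits in (1, 2, 3, 4, 5, 6, 8, 16):
    print(f"{bits:>2}-bit: ~{approx_size_gb(bits):.2f} GB")
```

For example, the 4-bit quant works out to roughly 0.77 GB and the 16-bit one to roughly 3.08 GB before overhead.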


Model tree for Vikhrmodels/QVikhr-2.5-1.5B-Instruct-r_GGUF

Base model: Qwen/Qwen2.5-1.5B
Quantized (9): this model