💨 QVikhr-2.5-1.5B-Instruct-r
An instruction-following model based on QVikhr-2.5-1.5B-Instruct-r, trained on the Russian-language RuMath dataset.
Description:
Vikhrmodels/QVikhr-2.5-1.5B-Instruct-r is a language model that has undergone specialized training using the RuMath method.
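A minimal usage sketch with the transformers library. This is an assumed standard chat-template workflow, not taken from the original card; the prompt and generation settings are illustrative.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Vikhrmodels/QVikhr-2.5-1.5B-Instruct-r"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Illustrative math prompt; the chat template ships with the tokenizer.
messages = [{"role": "user", "content": "Реши уравнение: 2x + 6 = 14"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```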
Authors
- Sergei Bratchikov, NLP Wanderer, Vikhr Team
- Aleksandr Nikolich, Vikhr Team
- Nikolay Kompanets, LakoMoor, Vikhr Team
- Konstantin Korolev, Vikhr Team
Citation
@inproceedings{nikolich2024vikhr,
title={Vikhr: Advancing Open-Source Bilingual Instruction-Following Large Language Models for Russian and English},
author={Aleksandr Nikolich and Konstantin Korolev and Sergei Bratchikov and Nikolay Kompanets and Igor Kiselev and Artem Shelmanov},
booktitle={Proceedings of the 4th Workshop on Multilingual Representation Learning (MRL) @ EMNLP-2024},
year={2024},
publisher={Association for Computational Linguistics},
url={https://arxiv.org/pdf/2405.13929}
}
Model tree for Vikhrmodels/QVikhr-2.5-1.5B-Instruct-r_GGUF
- Base model: Qwen/Qwen2.5-1.5B
- Finetuned: Vikhrmodels/QVikhr-2.5-1.5B-Instruct-r
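The GGUF repository can be loaded with llama-cpp-python. A minimal sketch under the assumption that a quantized file matching *Q4_K_M.gguf exists in the repo; the actual filenames are not listed in this card, so check the repository first.

```python
from llama_cpp import Llama

# Hypothetical quant filename pattern; replace with an actual GGUF file from the repo.
llm = Llama.from_pretrained(
    repo_id="Vikhrmodels/QVikhr-2.5-1.5B-Instruct-r_GGUF",
    filename="*Q4_K_M.gguf",
    n_ctx=4096,
)

response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Сколько будет 17 * 23?"}],
    max_tokens=256,
)
print(response["choices"][0]["message"]["content"])
```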