---
base_model: Qwen/Qwen2.5-3B-Instruct
language:
- en
library_name: transformers
license: other
license_name: qwen-research
license_link: https://huggingface.co/Qwen/Qwen2.5-3B-Instruct/blob/main/LICENSE
pipeline_tag: text-generation
tags:
- chat
- openvino
- openvino-export
---

This model was converted to the OpenVINO IR format from [`Qwen/Qwen2.5-3B-Instruct`](https://huggingface.co/Qwen/Qwen2.5-3B-Instruct) using [optimum-intel](https://github.com/huggingface/optimum-intel)
via the [export](https://huggingface.co/spaces/echarlaix/openvino-export) space.

First, make sure you have optimum-intel installed:

```bash
pip install optimum[openvino]
```
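
If you prefer to run the conversion yourself instead of using the export space, optimum-intel can export the original checkpoint on the fly with `export=True`. This is a minimal sketch; the exact export and weight-compression settings used for this repository are not documented here, and the output directory name is only an example.

```python
from optimum.intel import OVModelForCausalLM

# Download the original PyTorch checkpoint and convert it to OpenVINO IR on the fly
model = OVModelForCausalLM.from_pretrained("Qwen/Qwen2.5-3B-Instruct", export=True)

# Save the converted model locally so it can be reloaded without re-exporting
model.save_pretrained("Qwen2.5-3B-Instruct-openvino")
```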

To load the model, you can do the following:

```python
from optimum.intel import OVModelForCausalLM

model_id = "HelloSun/Qwen2.5-3B-Instruct-openvino"
model = OVModelForCausalLM.from_pretrained(model_id)
```
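
Once loaded, the model can be used like a regular `transformers` causal LM, including `generate()`. Below is a minimal sketch of a chat completion; it assumes the repository ships the tokenizer and chat template from the base model, and the prompt text is only an example.

```python
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "HelloSun/Qwen2.5-3B-Instruct-openvino"
model = OVModelForCausalLM.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Build a chat-formatted prompt with the tokenizer's chat template
messages = [{"role": "user", "content": "Give me a short introduction to OpenVINO."}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# Run generation on the OpenVINO model
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)

# Decode only the newly generated tokens
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```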