---
base_model: unsloth/Qwen2.5-Coder-7B-Instruct-bnb-4bit
library_name: transformers
model_name: onekq-ai/OneSQL-v0.1-Qwen-7B
tags:
- generated_from_trainer
- unsloth
- trl
- sft
license: apache-2.0
pipeline_tag: text-generation
---

# Introduction

This model specializes in the Text-to-SQL task. It is fine-tuned from the quantized version of [Qwen2.5-Coder-7B-Instruct](https://huggingface.co/Qwen/Qwen2.5-Coder-7B-Instruct).
Its sibling [32B model](https://huggingface.co/onekq-ai/OneSQL-v0.1-Qwen-32B) has an EX score of **63.33** and an R-VES score of **60.02** on the [BIRD leaderboard](https://bird-bench.github.io/).
The self-evaluation EX score of this model is **56.19**.

# Quick start

To use this model, craft your prompt so that it starts with your database schema in the form of **CREATE TABLE** statements, followed by your natural-language query preceded by **--**.
Make sure the prompt ends with **SELECT** so the model can finish the query for you.
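As a minimal sketch, a small helper like the following (hypothetical, not part of this model's API) can assemble prompts in that shape:

```python
def build_prompt(schema: str, question: str) -> str:
    """Build a Text-to-SQL prompt: schema first, then the natural-language
    question as a SQL comment, ending with 'SELECT ' for the model to complete."""
    return f"{schema}\n\n-- {question}\nSELECT "
```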

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
from peft import PeftModel

model_name = "unsloth/Qwen2.5-Coder-7B-Instruct-bnb-4bit"
adapter_name = "onekq-ai/OneSQL-v0.1-Qwen-7B"

tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.padding_side = "left"

# device_map="auto" already places the 4-bit base model on the GPU;
# calling .to("cuda") on a bitsandbytes-quantized model raises an error.
base_model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_name)

generator = pipeline("text-generation", model=model, tokenizer=tokenizer, return_full_text=False)

prompt = """
CREATE TABLE students (
    id INTEGER PRIMARY KEY,
    name TEXT,
    age INTEGER,
    grade TEXT
);

-- Find the three youngest students
SELECT """

result = generator(f"<|im_start|>system\nYou are a SQL expert. Return code only.<|im_end|>\n<|im_start|>user\n{prompt}<|im_end|>\n<|im_start|>assistant\n")[0]
print(result["generated_text"])
```

The model response is the finished SQL query, without the leading **SELECT**:
```sql
* FROM students ORDER BY age ASC LIMIT 3
```
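
To obtain a runnable statement, prepend the **SELECT** keyword back onto the generated text. A minimal sketch (the helper name is illustrative, not part of the model's API):

```python
def finish_query(generated_text: str) -> str:
    """Reassemble a full SQL statement from the model output, which
    continues the prompt's trailing 'SELECT' keyword."""
    return "SELECT " + generated_text.strip()
```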