|
--- |
|
base_model: inceptionai/jais-adapted-7b-chat |
|
language: |
|
- ar |
|
license: apache-2.0 |
|
tags: |
|
- text-generation-inference |
|
- transformers |
|
- unsloth |
|
- llama |
|
- trl |
|
datasets: |
|
- linagora/Tunisian_Derja_Dataset |
|
library_name: transformers |
|
--- |
|
## Model Overview |
|
|
|
Labess-7b-chat is an open, instruction-tuned model for Tunisian Derja. It is jais-adapted-7b-chat continually pre-trained on the Tunisian_Derja_Dataset.
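The training corpus is published on the Hugging Face Hub. Below is a minimal sketch for inspecting it with the `datasets` library; the `train` split name is an assumption, adjust it if the dataset uses a different configuration:

```python
from datasets import load_dataset

# Load the Tunisian Derja corpus used for continual pre-training
# (the "train" split name is an assumption).
ds = load_dataset("linagora/Tunisian_Derja_Dataset", split="train")
print(ds)
```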
|
## Uploaded model
|
|
|
- **Developed by:** Linagora |
|
- **License:** apache-2.0 |
|
- **Finetuned from model:** inceptionai/jais-adapted-7b-chat
|
## Usage |
|
Below are some code snippets to get started quickly with the model. First, install the Transformers library:
|
|
|
```sh |
|
pip install transformers |
|
``` |
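The inference example below also requires PyTorch; if it is not already in your environment, install it as well:

```sh
pip install torch
```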
|
Then run inference with the `pipeline` API:
|
```python |
|
import torch |
|
from transformers import pipeline |
|
|
|
pipe = pipeline( |
|
"text-generation", |
|
model="linagora/Labess-7b-chat-16bit", |
|
model_kwargs={"torch_dtype": torch.bfloat16}, |
|
device="cuda" # replace with "mps" to run on a Mac device |
|
) |
|
|
|
messages = [ |
|
{"role": "user", "content": 'وين تجي تونس؟'}, |
|
] |
|
|
|
outputs = pipe(messages, max_new_tokens=64, do_sample=True, temperature=0.2) |
|
assistant_response = outputs[0]["generated_text"][-1]["content"].strip() |
|
print(assistant_response) |
|
``` |
|
```
Response: تونس هي بلاد في شمال إفريقيا هي بلاد جميلة برشة ومعروفة في العالم الكل هي بلاد فيها مناظر طبيعية
```

(English: "Tunisia is a country in North Africa; it is a very beautiful country, well known around the whole world, a country of natural landscapes.")
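If you prefer working below the `pipeline` abstraction, here is a minimal sketch that loads the model directly and builds the prompt with the tokenizer's chat template. It assumes the repository ships a chat template (as the jais-adapted chat models do) and that `accelerate` is installed so `device_map="auto"` can place the weights:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "linagora/Labess-7b-chat-16bit"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # use torch.float16 on pre-Ampere GPUs
    device_map="auto",
)

messages = [{"role": "user", "content": "وين تجي تونس؟"}]  # "Where is Tunisia located?"

# Render the chat template and move the token ids to the model's device.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=64, do_sample=True, temperature=0.2)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```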
|
## Citations |
|
If you use **Labess-7b-chat**, please cite:
|
|
|
```bibtex
@misc{linagora2025LLM-tn,
  author = {Wajdi Ghezaiel and Jean-Pierre Lorré},
  title = {Labess-7b-chat: Tunisian Derja LLM},
  year = {2025},
  month = {January},
  url = {https://huggingface.co/datasets/Wajdi1976/Labess-7b-chat}
}
```
|
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth) |