Wajdi1976 committed 2ca1d1d (verified) · Parent: 6d11f40

Update README.md

Files changed (1):
  1. README.md (+14, -1)
README.md CHANGED

@@ -34,7 +34,7 @@ max_seq_length = 600 # Choose any! We auto support RoPE Scaling internally!
  dtype = None # None for auto detection. Float16 for Tesla T4, V100, Bfloat16 for Ampere+
  load_in_4bit = True # Use 4bit quantization to reduce memory usage. Can be False.
  model, tokenizer = FastLanguageModel.from_pretrained(
- model_name = "Wajdi1976/jais_arabic_tunisien_derija",
+ model_name = "Wajdi1976/Labess",
  max_seq_length = max_seq_length,
  dtype = dtype,
  load_in_4bit = load_in_4bit,
@@ -75,6 +75,19 @@ print(get_response(text))
  ```
  - Response: بالطبع نجم نجاوب على سؤالك واللهجه التونسية هي نوع من اللهجات اللي تتكلم في بلاد اسمها تونس كيفما فما برشا لهجات مختلفة كيما الإنجليزية أو الإسبانية الناس في العالم يتكلموا لغات متنوعة أما اللهجة التونسى هيا الطريقة الخاصة بالكلام للناس في البلاد هذيك يعني كان تسألني سؤال بالهججه تونسي نحب نعاونك باش نفهموه

+ ## Citations
+ When using the **Tunisian Derja Dataset** dataset, please cite:
+
+ ```bibtex
+ @model{linagora2025LLM-tn,
+ author = {Wajdi Ghezaiel and Jean-Pierre Lorré},
+ title = {Labess:Tunisian Derja LLM},
+ year = {2025},
+ month = {January},
+ url = {https://huggingface.co/datasets/Wajdi1976/Labess}
+ }
+
+ ```
  This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.

  [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
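For context, the code-level change in this commit is only the checkpoint id passed to Unsloth (`Wajdi1976/jais_arabic_tunisien_derija` → `Wajdi1976/Labess`); the rest of the diff adds a citation block. Below is a minimal loading sketch that reuses the parameters from the README snippet. The generation call at the end is an illustrative assumption, not the README's own `get_response` helper, and the prompt string is an arbitrary example.

```python
# Minimal sketch: load the renamed checkpoint with Unsloth, mirroring the updated README.
# Assumes `unsloth` is installed and a CUDA GPU is available.
from unsloth import FastLanguageModel

max_seq_length = 600   # matches the README's setting
dtype = None           # None for auto detection; Float16 for T4/V100, Bfloat16 for Ampere+
load_in_4bit = True    # 4-bit quantization to reduce memory usage

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name = "Wajdi1976/Labess",   # new repo id introduced by this commit
    max_seq_length = max_seq_length,
    dtype = dtype,
    load_in_4bit = load_in_4bit,
)

# Illustrative generation (the README's get_response helper is not reproduced here).
FastLanguageModel.for_inference(model)  # enable Unsloth's faster inference path
prompt = "شنية اللهجة التونسية؟"  # example prompt: "What is the Tunisian dialect?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

As the README's own comment notes, `load_in_4bit = True` trades some precision for a much smaller memory footprint; set it to `False` to load full-precision weights if memory allows.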