Update README.md
README.md CHANGED
@@ -43,12 +43,12 @@ base_model:
 
 > The logo was generated by MidJourney
 
+
 Sailor2 is a community-driven initiative that brings cutting-edge multilingual language models to South-East Asia (SEA).
 Our research highlights a strong demand for models in the **8B and 20B parameter** range for production use, alongside **1B models** for specialized applications,
 such as speculative decoding and research purposes.
 These models, released under the **Apache 2.0 license**, provide enhanced accessibility to advanced language technologies across the region.
 
-
 Sailor2 builds upon the foundation of the awesome multilingual model [Qwen 2.5](https://huggingface.co/collections/Qwen/qwen25-66e81a666513e518adb90d9e) and
 is continuously pre-trained on **500B tokens** to support **15 languages** better with a unified model.
 These languages include English, Chinese, Burmese, Cebuano, Ilocano, Indonesian, Javanese, Khmer, Lao, Malay, Sundanese, Tagalog, Thai, Vietnamese, and Waray.
@@ -59,7 +59,7 @@ The Sailor2 model comes in three sizes, 1B, 8B, and 20B, which are **expanded fr
 - **Model Collections:** [Base Model & Chat Model](https://huggingface.co/collections/sail/sailor2-language-models-674d7c9e6b4dbbd9a869906b)
 - **Project Website:** [sea-sailor.github.io/blog/sailor2/](https://sea-sailor.github.io/blog/sailor2/)
 - **Codebase:** [github.com/sail-sg/sailor2](https://github.com/sail-sg/sailor2)
-- **Technical Report:**
+- **Technical Report:** [Sailor2 Report](https://arxiv.org/pdf/2502.12982)
 
 
 ## Training details
@@ -87,7 +87,7 @@ from transformers import AutoModelForCausalLM, AutoTokenizer
 device = "cuda"
 
 model = AutoModelForCausalLM.from_pretrained(
-    'sail/Sailor2-
+    'sail/Sailor2-20B-Chat',
     torch_dtype=torch.bfloat16,
     device_map="auto"
 )
@@ -138,10 +138,11 @@ No restrict on the research and the commercial use.
 If you find Sailor2 useful, please cite our work as follows:
 
 ```
-@
+@article{sailor2report,
   title = {Sailor2: Sailing in South-East Asia with Inclusive Multilingual LLM},
   author = {Longxu Dou and Qian Liu and Fan Zhou and Changyu Chen and Zili Wang and Ziqi Jin and Zichen Liu and Tongyao Zhu and Cunxiao Du and Penghui Yang and Haonan Wang and Jiaheng Liu and Yongchi Zhao and Xiachong Feng and Xin Mao and Man Tsung Yeung and Kunat Pipatanakul and Fajri Koto and Min Si Thu and Hynek Kydl{\'\i}{\v{c}}ek and Zeyi Liu and Qunshu Lin and Sittipong Sripaisarnmongkol and Kridtaphad Sae-Khow and Nirattisai Thongchim and Taechawat Konkaew and Narong Borijindargoon and Anh Dao and Matichon Maneegard and Phakphum Artkaew and Zheng-Xin Yong and Quan Nguyen and Wannaphong Phatthiyaphaibun and Hoang H. Tran and Mike Zhang and Shiqi Chen and Tianyu Pang and Chao Du and Xinyi Wan and Wei Lu and Min Lin},
-
+  journal={arXiv preprint arXiv:2502.12982},
+  year = {2025}
 }
 ```
 
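For readers following the quickstart visible in the third hunk, here is a minimal end-to-end sketch of how that snippet plausibly continues once the checkpoint name is switched to `sail/Sailor2-20B-Chat`. Only the `from_pretrained` call appears in the diff context, so the tokenizer setup, example message, and generation settings below are illustrative assumptions based on the standard `transformers` chat-template API, not the README's exact wording:

```python
# Sketch only: completes the partial quickstart shown in the diff context.
# The example message and generation settings are assumptions, not from the README.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sail/Sailor2-20B-Chat"  # repo id the diff switches to

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Example user turn (illustrative); the checkpoint ships a Qwen2.5-style chat template.
messages = [{"role": "user", "content": "Tolong jelaskan apa itu Sailor2 dalam dua kalimat."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```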
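The README also motivates the 1B checkpoints with speculative decoding. As a hypothetical illustration of that use case (not something shown in this commit), the sketch below uses `transformers` assisted generation with a small Sailor2 checkpoint as the draft model; the `sail/Sailor2-1B-Chat` repo id is an assumption based on the collection's naming pattern and should be checked against the model hub:

```python
# Hypothetical sketch of speculative (assisted) decoding: a small Sailor2 draft
# model proposes tokens that the large target model verifies, so the final output
# still comes from the target. The 1B repo id below is an assumed name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

target_id = "sail/Sailor2-20B-Chat"   # named in the diff
draft_id = "sail/Sailor2-1B-Chat"     # assumption: check the model collection

tokenizer = AutoTokenizer.from_pretrained(target_id)
target = AutoModelForCausalLM.from_pretrained(
    target_id, torch_dtype=torch.bfloat16, device_map="auto"
)
# Assisted generation requires the draft to share the target's tokenizer, which
# holds here because both Sailor2 sizes are expanded from Qwen2.5 checkpoints.
draft = AutoModelForCausalLM.from_pretrained(
    draft_id, torch_dtype=torch.bfloat16, device_map="auto"
)

inputs = tokenizer("Please introduce Bangkok in Thai.", return_tensors="pt").to(target.device)
# Passing assistant_model switches generate() to assisted generation.
outputs = target.generate(**inputs, assistant_model=draft, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```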