---
license: llama3.2
pipeline_tag: text-generation
---

**Llama3.2-Typhoon2-3B**: Thai Large Language Model (Base)

**Llama3.2-Typhoon2-3B** is a pretrained-only Thai 🇹🇭 large language model with 3 billion parameters, based on Llama3.2-3B.

For the technical report, please see our [arXiv paper](https://arxiv.org/abs/2412.13702).

*To acknowledge Meta's effort in creating the foundation model and to comply with the license, we explicitly include "llama-3.2" in the model name.*

## **Performance**

| Model                 | ThaiExam  | ONET     | IC        | A-Level   | TGAT      | TPAT      | M3Exam    | Math       | Science    | Social     | Thai       |
|------------------------|-----------|----------|-----------|-----------|-----------|-----------|-----------|------------|------------|------------|------------|
| **Typhoon2 Llama3.2 3B Base**   | **44.53%** | **40.12%** | 40.00%    | **26.77%** | **69.23%** | **46.55%** | **41.84%** | **24.43%** | **41.30%** | **60.07%** | **41.56%** |
| **Llama3.2 3B**        | 40.42%    | 30.86%   | **46.31%** | 20.47%    | 63.07%    | 41.37%    | 36.81%    | 21.71%     | 36.23%     | 50.74%     | 38.54%     |


## **Model Description**

- **Model type**: A 3B decoder-only model based on Llama architecture.
- **Requirement**: transformers 4.45.0 or newer.
- **Primary Language(s)**: Thai 🇹🇭 and English 🇬🇧
- **License**: [Llama 3.2 Community License](https://github.com/meta-llama/llama-models/blob/main/models/llama3_2/LICENSE)
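The transformers requirement above can be satisfied with a pinned pip install; a minimal setup line:

```shell
# Install transformers 4.45.0 or newer, as required by the model description.
pip install "transformers>=4.45.0"
```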


## **Intended Uses & Limitations**

This model is a pretrained base model, so it may not follow human instructions without one/few-shot prompting or instruction fine-tuning. The model has no moderation mechanisms and may generate harmful or inappropriate responses.
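Because the model is a base model rather than an instruct model, tasks are usually framed as few-shot completions: the prompt shows worked input/output pairs and leaves the final answer blank for the model to complete. A minimal sketch of building such a prompt (the helper name and example pairs are illustrative, not part of any Typhoon API):

```python
def build_few_shot_prompt(examples, query):
    """Concatenate worked input/output pairs, then leave the final answer
    blank so the base model continues the established pattern."""
    parts = [f"Input: {x}\nOutput: {y}" for x, y in examples]
    parts.append(f"Input: {query}\nOutput:")
    return "\n\n".join(parts)

# Illustrative Thai-to-English pairs; any task with a consistent format works.
examples = [
    ("Translate to English: สวัสดี", "Hello"),
    ("Translate to English: ขอบคุณ", "Thank you"),
]
prompt = build_few_shot_prompt(examples, "Translate to English: ใช่")
print(prompt)
```

The resulting string ends with an open `Output:` slot, which is where the model's generated completion begins.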

## **Follow us**

**https://twitter.com/opentyphoon**

## **Support**

**https://discord.gg/us5gAYmrxw**

## **Citation**

If you find Typhoon2 useful for your work, please cite it using:
```
@misc{typhoon2,
      title={Typhoon 2: A Family of Open Text and Multimodal Thai Large Language Models}, 
      author={Kunat Pipatanakul and Potsawee Manakul and Natapong Nitarach and Warit Sirichotedumrong and Surapon Nonesung and Teetouch Jaknamon and Parinthapat Pengpun and Pittawat Taveekitworachai and Adisai Na-Thalang and Sittipong Sripaisarnmongkol and Krisanapong Jirayoot and Kasima Tharnpipitchai},
      year={2024},
      eprint={2412.13702},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2412.13702}, 
}
```