---
license: llama3
language:
- th
- en
pipeline_tag: text-generation
tags:
- pretrained
---
**Llama-3-Typhoon-1.5-8B: Thai Large Language Model (Pretrained)**

**Typhoon-8B** is a *pretrained-only* Thai 🇹🇭 large language model with 8 billion parameters, based on Llama3-8B.

For release notes, please see our [blog](https://blog.opentyphoon.ai/typhoon-1-5-release-a9364cb8e8d7). *To acknowledge Meta's effort in creating the foundation model and to comply with the license, we explicitly include "llama-3" in the model name.*

## **Model Description**

- **Model type**: An 8B pretrained decoder-only model based on the Llama architecture.
- **Requirement**: transformers 4.38.0 or newer.
- **Primary Language(s)**: Thai 🇹🇭 and English 🇬🇧
- **License**: [Llama 3 Community License](https://llama.meta.com/llama3/license/)
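
For reference, here is a minimal loading sketch using 🤗 Transformers. The repository ID and generation settings below are illustrative assumptions, not a prescribed configuration; substitute the actual Hub path for this model:

```python
# Minimal sketch: load the model with Hugging Face Transformers (>= 4.38.0).
# The repository ID below is an assumption; replace it with the actual Hub path.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "scb10x/llama-3-typhoon-v1.5-8b"  # assumed Hub repository ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # bf16 keeps the 8B weights within ~16 GB
    device_map="auto",
)

prompt = "ประเทศไทยมีจังหวัดทั้งหมด"  # "Thailand has a total of ... provinces"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```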

## **Intended Uses & Limitations**

This model is a pretrained base model; it may not follow human instructions without one-/few-shot prompting or instruction fine-tuning. The model does not include any moderation mechanisms and may generate harmful or inappropriate responses.
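
As an illustration, one way to elicit instruction-like behavior from a base model is few-shot prompting: prepend worked examples so the model continues the pattern. A minimal sketch (the prompt format and repository ID are our assumptions):

```python
# Minimal few-shot sketch for the base (non-instruct) model: prepend worked
# examples so the model continues the pattern instead of free-running.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="scb10x/llama-3-typhoon-v1.5-8b",  # assumed Hub repository ID
    device_map="auto",
)

few_shot_prompt = (
    "Q: Translate to Thai: Hello\nA: สวัสดี\n"
    "Q: Translate to Thai: Thank you\nA: ขอบคุณ\n"
    "Q: Translate to Thai: Good luck\nA:"
)
result = generator(few_shot_prompt, max_new_tokens=10, do_sample=False)
print(result[0]["generated_text"])
```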

## **Follow us**

**https://twitter.com/opentyphoon**

## **Support**

**https://discord.gg/CqyBscMFpg**

## **SCB10X AI Team**

- Kunat Pipatanakul, Potsawee Manakul, Sittipong Sripaisarnmongkol, Natapong Nitarach, Pathomporn Chokchainant, Kasima Tharnpipitchai
- If you find Typhoon-8B useful for your work, please cite it using:

```bibtex
@article{pipatanakul2023typhoon,
    title={Typhoon: Thai Large Language Models}, 
    author={Kunat Pipatanakul and Phatrasek Jirabovonvisut and Potsawee Manakul and Sittipong Sripaisarnmongkol and Ruangsak Patomwong and Pathomporn Chokchainant and Kasima Tharnpipitchai},
    year={2023},
    journal={arXiv preprint arXiv:2312.13951},
    url={https://arxiv.org/abs/2312.13951}
}
```

## **Contact Us**

- General & Collaboration: **[[email protected]](mailto:[email protected])**, **[[email protected]](mailto:[email protected])**
- Technical: **[[email protected]](mailto:[email protected])**