🚀 🇹🇭 OpenThaiGPT R1 32b
OpenThaiGPT R1 32b is an advanced Thai language reasoning model with 32 billion parameters. It outperforms larger models such as DeepSeek R1 70b and Typhoon R1 Distill 70b despite being less than half their size, and excels at complex reasoning tasks in Thai, including mathematics, logic, and code reasoning.
✨ Features
- State-of-the-art Thai reasoning model: Outperforms larger models in mathematical and logical reasoning tasks.
- Explicit reasoning capabilities: Can show step-by-step thought processes.
- Significantly smaller size: At 32 billion parameters, it outperforms 70b-class models on the benchmarks below.
- Specialized for Thai language reasoning: Handles complex mathematics and logic problems in Thai.
- High performance on code reasoning: Performs well in both Thai and English code reasoning.
📊 Benchmark Results
| Benchmark (SkyThought) | OpenThaiGPT R1 32b | DeepSeek R1 70b | Typhoon R1 Distill 70b |
|---|---|---|---|
| AIME24-TH | 56.67 | 33.33 | 53.33 |
| AIME24 | 63.36 | 53.33 | 53.33 |
| MATH500-TH | 83.8 | 75.4 | 81 |
| MATH500 | 89.4 | 88.88 | 90.2 |
| LiveCodeBench-TH | 62.16 | 53.15 | 47.75 |
| LiveCodeBench | 69.67 | 64.97 | 54.79 |
| OpenThaiEval | 76.05 | 74.17 | 77.59 |
| AVERAGE | 71.58 | 63.31 | 65.42 |
📚 Documentation
Recommended System Prompt
<No system prompt>
Model Technical Report
https://arxiv.org/abs/2504.01789
If OpenThaiGPT has been useful for your work, please consider citing it as follows:
```bibtex
@misc{yuenyong2025openthaigpt16r1thaicentric,
      title={OpenThaiGPT 1.6 and R1: Thai-Centric Open Source and Reasoning Large Language Models},
      author={Sumeth Yuenyong and Thodsaporn Chay-intr and Kobkrit Viriyayudhakorn},
      year={2025},
      eprint={2504.01789},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2504.01789},
}
```
💻 Usage Examples
Online Web Interface
https://chindax.iapp.co.th
Transformers
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "openthaigpt/openthaigpt-r1-32b-instruct"

# Load the model and tokenizer; device_map="auto" spreads the weights across available GPUs
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",
    device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(model_name)

prompt = "จงหาพื้นที่ของวงกลมที่มีรัศมี 7 หน่วย"  # "Find the area of a circle with radius 7 units"
messages = [
    {"role": "user", "content": prompt}
]

# Apply the chat template and build the model inputs
text = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True
)
model_inputs = tokenizer([text], return_tensors="pt").to(model.device)

# Generate; do_sample=True is required for the temperature setting to take effect
generated_ids = model.generate(
    **model_inputs,
    max_new_tokens=16384,
    do_sample=True,
    temperature=0.6
)

# Keep only the newly generated tokens and decode them
generated_ids = [
    output_ids[len(input_ids):] for input_ids, output_ids in zip(model_inputs.input_ids, generated_ids)
]
response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(response)
```
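The chat template below strips everything up to `</think>` from earlier assistant turns, which indicates the model emits its reasoning inside `<think> ... </think>` before the final answer. A minimal post-processing sketch, assuming `response` comes from the example above and follows that convention:

```python
# Separate the explicit reasoning from the final answer (assumes <think> ... </think> markers).
if "</think>" in response:
    reasoning, answer = response.split("</think>", 1)
    reasoning = reasoning.replace("<think>", "").strip()
    answer = answer.strip()
else:
    reasoning, answer = "", response.strip()

print("Reasoning steps:\n", reasoning)
print("Final answer:\n", answer)
```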
vLLM
1. Install vLLM: https://github.com/vllm-project/vllm
2. Run the server:
```bash
vllm serve openthaigpt/openthaigpt-r1-32b-instruct --tensor-parallel-size 2
```
- Note: set `--tensor-parallel-size` to the number of available GPUs (2 in this example).
3. Run inference (cURL example):
```bash
curl -X POST 'http://127.0.0.1:8000/v1/chat/completions' \
-H 'Content-Type: application/json' \
-d '{
  "model": "openthaigpt/openthaigpt-r1-32b-instruct",
  "messages": [
    {
      "role": "user",
      "content": "จงหาพื้นที่ของวงกลมที่มีรัศมี 7 หน่วย"
    }
  ],
  "max_tokens": 16384,
  "temperature": 0.6,
  "top_p": 0.95,
  "top_k": 40
}'
```
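Because the vLLM server exposes an OpenAI-compatible API, the same request can also be made with the official `openai` Python client. The sketch below mirrors the cURL parameters above; the `api_key` value is a placeholder, since vLLM does not check it by default.

```python
# Query the vLLM server started above via its OpenAI-compatible endpoint.
from openai import OpenAI

client = OpenAI(base_url="http://127.0.0.1:8000/v1", api_key="EMPTY")  # placeholder key

response = client.chat.completions.create(
    model="openthaigpt/openthaigpt-r1-32b-instruct",
    messages=[
        {"role": "user", "content": "จงหาพื้นที่ของวงกลมที่มีรัศมี 7 หน่วย"}  # area of a circle with radius 7
    ],
    max_tokens=16384,
    temperature=0.6,
    top_p=0.95,
    extra_body={"top_k": 40},  # top_k is a vLLM extension, not part of the OpenAI schema
)
print(response.choices[0].message.content)
```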
GPU Memory Requirements
| Number of Parameters | FP16 | 8-bit (Quantized) | 4-bit (Quantized) |
|---|---|---|---|
| 32b | 64 GB | 32 GB | 16 GB |
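To reach the 4-bit figure in the table with the Transformers stack, one option is on-the-fly quantization via `bitsandbytes`. This is a minimal sketch, not an official recommendation; the quantization settings below are assumptions you may need to adjust.

```python
# Illustrative 4-bit load with bitsandbytes (requires the bitsandbytes package);
# roughly matches the ~16 GB figure in the table above.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_name = "openthaigpt/openthaigpt-r1-32b-instruct"
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",         # assumed setting, adjust as needed
    bnb_4bit_compute_dtype="bfloat16"  # assumed setting, adjust as needed
)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=quant_config,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```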
Chat Template
{% if not add_generation_prompt is defined %}{% set add_generation_prompt = false %}{% endif %}{% set ns = namespace(is_first=false, is_tool=false, is_output_first=true, system_prompt='') %}{%- for message in messages %}{%- if message['role'] == 'system' %}{% set ns.system_prompt = message['content'] %}{%- endif %}{%- endfor %}{{bos_token}}{{ns.system_prompt}}{%- for message in messages %}{%- if message['role'] == 'user' %}{%- set ns.is_tool = false -%}{{'<|User|>' + message['content']}}{%- endif %}{%- if message['role'] == 'assistant' and message['content'] is none %}{%- set ns.is_tool = false -%}{%- for tool in message['tool_calls']%}{%- if not ns.is_first %}{{'<|Assistant|><|tool▁calls▁begin|><|tool▁call▁begin|>' + tool['type'] + '<|tool▁sep|>' + tool['function']['name'] + '\\n' + '```json' + '\\n' + tool['function']['arguments'] + '\\n' + '```' + '<|tool▁call▁end|>'}}{%- set ns.is_first = true -%}{%- else %}{{'\\n' + '<|tool▁call▁begin|>' + tool['type'] + '<|tool▁sep|>' + tool['function']['name'] + '\\n' + '```json' + '\\n' + tool['function']['arguments'] + '\\n' + '```' + '<|tool▁call▁end|>'}}{{'<|tool▁calls▁end|><|end▁of▁sentence|>'}}{%- endif %}{%- endfor %}{%- endif %}{%- if message['role'] == 'assistant' and message['content'] is not none %}{%- if ns.is_tool %}{{'<|tool▁outputs▁end|>' + message['content'] + '<|end▁of▁sentence|>'}}{%- set ns.is_tool = false -%}{%- else %}{% set content = message['content'] %}{% if '</think>' in content %}{% set content = content.split('</think>')[-1] %}{% endif %}{{'<|Assistant|>' + content + '<|end▁of▁sentence|>'}}{%- endif %}{%- endif %}{%- if message['role'] == 'tool' %}{%- set ns.is_tool = true -%}{%- if ns.is_output_first %}{{'<|tool▁outputs▁begin|><|tool▁output▁begin|>' + message['content'] + '<|tool▁output▁end|>'}}{%- set ns.is_output_first = false %}{%- else %}{{'\\n<|tool▁output▁begin|>' + message['content'] + '<|tool▁output▁end|>'}}{%- endif %}{%- endif %}{%- endfor -%}{% if ns.is_tool %}{{'<|tool▁outputs▁end|>'}}{% endif %}{% if add_generation_prompt and not ns.is_tool %}{{'<|Assistant|>'}}{% endif %}
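For reference, applying this template to a single user turn (as in the Transformers example above) produces a prompt of roughly the form `<bos><|User|>...<|Assistant|>`, with any system message placed right after the BOS token. A quick way to inspect the exact rendered string:

```python
# Inspect the prompt string produced by the chat template shipped with the tokenizer.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("openthaigpt/openthaigpt-r1-32b-instruct")
messages = [{"role": "user", "content": "สวัสดี"}]  # "Hello"
print(tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True))
```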
📄 License
This model is available for Research and Commercial uses under the specified terms. Please see the LICENSE file for more information.
🛠️ Support
- Official website: https://openthaigpt.aieat.or.th
- Facebook page: https://web.facebook.com/groups/openthaigpt
- Discord server for discussion and support: here
- E-mail: kobkrit@iapp.co.th
OpenThaiGPT Team
- Kobkrit Viriyayudhakorn (kobkrit@iapp.co.th / kobkrit@aieat.or.th)
- Sumeth Yuenyong (sumeth.yue@mahidol.edu)
- Thodsaporn Chay-intr (thodsaporn@iapp.co.th)
💰 Sponsors
- Supported by 8 Nvidia H100 GPUs from Siam AI Corporation Co., Ltd.: https://siam.ai/
- Received research funding from the National Science and Technology Development Fund, managed by the National Innovation Agency, in collaboration with IApp Technology Co., Ltd. The project is implemented by the Thai Artificial Intelligence Entrepreneurs Association.
Disclaimer: Responses generated by the model are provided as-is and are not guaranteed to be accurate.