Model Provider: DeepSeek

DeepSeek is an AI company focused on developing large language models, with an emphasis on improving performance in vertical domains.

Published Models: 18
| Model | Input (¥ / M tokens) | Output (¥ / M tokens) | Context Length |
|---|---|---|---|
| DeepSeek R1 (Jan '25) | 4.03 | 15.98 | 128k |
| DeepSeek R1 0528 (May '25) | 3.60 | 15.48 | 128k |
| DeepSeek R1 0528 Qwen3 8B | 3.60 | 15.48 | 128k |
| DeepSeek V3 | 1.94 | 7.92 | 128k |
| DeepSeek R1 Distill Qwen 14B | - | - | 128k |
| DeepSeek R1 Distill Llama 70B | 0.72 | 2.88 | 128k |
| DeepSeek V2.5 | 1.01 | 2.02 | 128k |
| DeepSeek V3 (Dec '24) | 1.94 | 7.92 | 128k |
| DeepSeek V2 Chat | - | - | 128k |
| DeepSeek Coder V2 Lite Instruct | - | - | 128k |
| DeepSeek R1 Distill Qwen 32B | 0.86 | 1.30 | 128k |
| DeepSeek V2.5 (Dec '24) | 1.01 | 2.02 | 128k |
| DeepSeek LLM 67B Chat (V1) | 3.96 | 15.77 | 4k |
| DeepSeek R1 Distill Llama 8B | - | - | 128k |
| DeepSeek Coder V2 | - | - | 128k |
| DeepSeek V3 0324 (Mar '25) | - | - | 128k |
| DeepSeek R1 Distill Qwen 1.5B | - | - | 128k |
| DeepSeek R1 | 3.96 | 15.77 | 128k |

All models are developed by DeepSeek; "-" indicates no price is listed.
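As a quick illustration of how per-million-token pricing translates into a per-request cost, the sketch below computes the cost of a single request from token counts and the prices in the table. The function name and parameters are illustrative only (not part of any DeepSeek SDK), and the example prices are the DeepSeek V3 rates listed above.

```python
# Sketch: converting per-million-token prices into a per-request cost.
# The prices used here (DeepSeek V3: ¥1.94 input / ¥7.92 output per million
# tokens) come from the table above; the function itself is hypothetical.

def request_cost(input_tokens: int, output_tokens: int,
                 input_price: float, output_price: float) -> float:
    """Return the cost in ¥ for one request, given per-million-token prices."""
    return (input_tokens / 1_000_000) * input_price \
         + (output_tokens / 1_000_000) * output_price

# Example: a 2,000-token prompt with an 800-token completion on DeepSeek V3.
cost = request_cost(2_000, 800, input_price=1.94, output_price=7.92)
print(f"¥{cost:.6f}")  # roughly ¥0.0102
```

The same arithmetic applies to any row in the table; only the two price arguments change per model.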