# 7 Billion Parameters
## Teuken-7B-instruct-research-v0.4

- **Author:** openGPT-X
- **License:** Other
- **Tags:** Large Language Model · Transformers · Multilingual
- **Downloads:** 1,443 · **Likes:** 81

Teuken-7B-instruct-research-v0.4 is a 7-billion-parameter, instruction-tuned multilingual large language model covering all 24 official EU languages, developed with a focus on European values and multilingual task scenarios.
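A minimal sketch of loading the model through the transformers chat interface. The repo id is inferred from the listed author (openGPT-X); `trust_remote_code` is assumed to be needed for the model's custom tokenizer code, and the exact role names and any language-specific template options are defined by the model card, not here.

```python
# Sketch: load Teuken-7B-instruct-research-v0.4 and run one chat turn.
# Assumptions: repo id inferred from the listing; trust_remote_code assumed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openGPT-X/Teuken-7B-instruct-research-v0.4"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,
    torch_dtype="auto",   # use the checkpoint's native dtype
    device_map="auto",    # place weights on available GPU(s)
)

# Build a chat-formatted prompt; role names / language-specific template
# options follow the model card's conventions.
messages = [{"role": "user", "content": "What are the official languages of the EU?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=200)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```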
## Qwen2-7B-Multilingual-RP

- **Author:** maywell
- **License:** Apache-2.0
- **Tags:** Large Language Model · Transformers · Multilingual
- **Downloads:** 646 · **Likes:** 57

Qwen2-7B-Multilingual-RP is a multilingual role-play large language model based on Qwen2-7B. It supports English, Korean, Japanese, Chinese, and Spanish, with a 32k-token context length.
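A minimal role-play sketch, assuming the checkpoint is published as `maywell/Qwen2-7B-Multilingual-RP` (the handle comes from the listing) and uses the standard Qwen2 chat template; the persona text is illustrative.

```python
# Sketch: one role-play turn with a persona defined in the system prompt.
# Assumptions: repo id and standard Qwen2 chat template; persona is made up.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "maywell/Qwen2-7B-Multilingual-RP"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

# The system prompt sets the persona; the 32k context window leaves room
# for long multi-turn role-play histories.
messages = [
    {"role": "system", "content": "You are Mina, a cheerful innkeeper in a fantasy town."},
    {"role": "user", "content": "Good evening! Do you have a room for the night?"},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.8)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```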
## DictaLM-2.0

- **Author:** dicta-il
- **License:** Apache-2.0
- **Tags:** Large Language Model · Transformers · Multilingual
- **Downloads:** 24.86k · **Likes:** 14

DictaLM-2.0 is a 7-billion-parameter pretrained generative text model optimized for Hebrew, based on an adapted Mistral-7B architecture.
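Because this is a pretrained base model rather than an instruct model, it is prompted with plain text instead of a chat template. A minimal completion sketch, with the repo id taken from the listed author:

```python
# Sketch: plain text completion with the DictaLM-2.0 base model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "dicta-il/dictalm2.0"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

prompt = "עברית היא שפה"  # "Hebrew is a language"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

output = model.generate(**inputs, max_new_tokens=60, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```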
## CodeLlama-7b-Python-hf

- **Author:** meta-llama
- **Tags:** Large Language Model · Transformers · Other
- **Downloads:** 2,271 · **Likes:** 22

Code Llama is a family of code-generation models developed by Meta; this 7-billion-parameter variant is specialized for Python.
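A minimal code-completion sketch. The repo id is assumed from the listed author (meta-llama), and the checkpoint is gated on the Hub, so access must be requested first; the Python variants are completion models, so the prompt is simply the start of a function.

```python
# Sketch: Python code completion with CodeLlama-7b-Python.
# Assumptions: repo id inferred from the listing; gated repo access granted.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/CodeLlama-7b-Python-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

# Give the model the start of a function and let it continue.
prompt = 'def fibonacci(n: int) -> int:\n    """Return the n-th Fibonacci number."""\n'
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

output = model.generate(**inputs, max_new_tokens=120, do_sample=False)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```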