# 13 billion parameters
## PersianLLaMA 13B
The first large language model built specifically for Persian, with 13 billion parameters, trained on the Persian Wikipedia corpus and intended for a range of natural language processing tasks.
Tags: Large Language Model · Transformers · Other
Publisher: ViraIntelligentDataMining · Downloads: 3,291 · Likes: 11
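Since the card carries the Transformers tag, a minimal loading sketch may help. The repository ID below is an assumption (check the publisher's page for the exact name); the pattern is the standard causal-LM workflow in the transformers library.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ViraIntelligentDataMining/PersianLLaMA-13B"  # assumed repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # 13B weights take roughly 26 GB in fp16
    device_map="auto",          # requires the accelerate package
)

prompt = "..."  # any Persian text prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```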
## Spivavtor XXL
An instruction fine-tuned Ukrainian text-editing model based on CohereForAI/aya-101, focused on tasks such as text rewriting, simplification, and grammar correction.
Tags: Large Language Model · Transformers · Other
Publisher: grammarly · Downloads: 43 · Likes: 4
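Because aya-101 is an encoder-decoder (mT5-style) model, a fine-tune of it is loaded as a seq2seq model rather than a causal LM. The sketch below assumes a repository ID of grammarly/spivavtor-xxl and an illustrative instruction wording; the model card documents the actual supported instructions.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "grammarly/spivavtor-xxl"  # assumed repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Instruction plus text to edit (the exact template is an assumption):
# "Correct the grammar in this sentence: I to go to school yesterday."
prompt = "Виправте граматику в цьому реченні: Я йти до школи вчора."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```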
## TowerInstruct 13B v0.1 GGUF
A 13-billion-parameter language model fine-tuned from TowerBase on the TowerBlocks supervised fine-tuning dataset, designed for a range of translation-related tasks and distributed here as GGUF quantizations.
Tags: Large Language Model · Supports Multiple Languages
Publisher: LoneStriker · Downloads: 57 · Likes: 9
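GGUF files are meant for llama.cpp-based runtimes rather than the transformers library. A minimal sketch with llama-cpp-python follows; the file name is an assumption, and for best results the prompt should follow the format documented on the original TowerInstruct model card.

```python
from llama_cpp import Llama

llm = Llama(
    model_path="./towerinstruct-13b-v0.1.Q4_K_M.gguf",  # assumed local file name
    n_ctx=4096,       # context window
    n_gpu_layers=-1,  # offload all layers if llama.cpp was built with GPU support
)

# Plain completion-style call with an illustrative translation prompt.
out = llm(
    "Translate the following text from Portuguese into English.\n"
    "Portuguese: Um grupo de investigadores lançou um novo modelo.\n"
    "English:",
    max_tokens=64,
)
print(out["choices"][0]["text"])
```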
## Stockmark 13B Instruct
License: MIT
A 13-billion-parameter Japanese large language model developed by Stockmark Inc. and optimized through instruction fine-tuning.
Tags: Large Language Model · Transformers · Japanese
Publisher: stockmark · Downloads: 2,244 · Likes: 10