
# Text Completion

## Kyrgyzbert

**metinovadilet** · Apache-2.0 · Tags: Large Language Model, Transformers, Other

A small-scale language model based on the BERT architecture, designed specifically for Kyrgyz natural language processing applications.

## Erlangshen DeBERTa V2 320M Chinese

**IDEA-CCNL** · Apache-2.0 · Tags: Large Language Model, Transformers, Chinese

A Chinese pre-trained language model based on the DeBERTa-v2 architecture with 320 million parameters, excelling at natural language understanding tasks; see the fill-mask sketch after this list.

## Roberta Pt Br

**josu** · Tags: Large Language Model, Transformers, Other

A pre-trained language model based on the RoBERTa architecture, optimized specifically for Brazilian Portuguese.

## Bertin Base Random

**bertin-project** · Tags: Large Language Model, Spanish

A RoBERTa-base model trained entirely from scratch on Spanish data, specializing in masked language modeling.

## Bert L12 H240 A12

**eli4s** · Tags: Large Language Model, Transformers

A BERT variant pre-trained with knowledge distillation, using a hidden dimension of 240 and 12 attention heads, suited to masked language modeling; see the config sketch after this list.
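
The Erlangshen model above is a masked language model, so it can be exercised with the Hugging Face `transformers` fill-mask pipeline. Below is a minimal sketch; the checkpoint id `IDEA-CCNL/Erlangshen-DeBERTa-v2-320M-Chinese` is an assumption inferred from the author and model names on this page, not confirmed by it.

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer, FillMaskPipeline

# Model id is an assumption inferred from the author/model names above.
model_id = "IDEA-CCNL/Erlangshen-DeBERTa-v2-320M-Chinese"

# use_fast=False: sentencepiece-based tokenizers in this model family
# typically ship without a fast variant.
tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=False)
model = AutoModelForMaskedLM.from_pretrained(model_id)

fill_mask = FillMaskPipeline(model=model, tokenizer=tokenizer)

# "The true meaning of life is [MASK]." -- the model fills the mask token.
for pred in fill_mask("生活的真谛是[MASK]。", top_k=5):
    print(f"{pred['token_str']!r}: {pred['score']:.3f}")
```

The same pattern applies to the other masked LMs on this page, swapping in each model's own mask token (RoBERTa-style checkpoints such as Bertin Base Random use `<mask>` rather than `[MASK]`).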
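
The last entry states its architecture outright: 12 layers, a 240-dimensional hidden state, and 12 attention heads (20 dimensions per head). Here is a sketch of an equivalent `transformers` `BertConfig`, useful for sizing such a distilled student; the intermediate size and vocabulary are assumptions, since the page does not state them.

```python
from transformers import BertConfig, BertForMaskedLM

# Dimensions taken from the description above; everything else is assumed.
config = BertConfig(
    num_hidden_layers=12,      # L12
    hidden_size=240,           # H240 (240 / 12 heads = 20 dims per head)
    num_attention_heads=12,    # A12
    intermediate_size=960,     # assumed: the usual 4x hidden-size ratio
)

# A randomly initialized student of this shape, before any distillation.
model = BertForMaskedLM(config)
print(f"parameters: {sum(p.numel() for p in model.parameters()):,}")
```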