# Uzbek pre-trained

**Uztext 568Mb Roberta BPE** (by rifkat)
UzRoBerta is a pre-trained Uzbek (Cyrillic script) model for masked language modeling and next sentence prediction.
Category: Large Language Model · Transformers
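Since the listing describes a RoBERTa-style masked language model distributed via the Transformers library, a typical way to try it is the `fill-mask` pipeline. The sketch below is a minimal example, assuming the checkpoint is hosted on the Hugging Face Hub; the repo id `rifkat/uztext-568Mb-Roberta-BPE` and the sample Uzbek sentence are illustrative assumptions, not confirmed by this page.

```python
from transformers import pipeline

# Assumed Hub repo id for the "Uztext 568Mb Roberta BPE" checkpoint; adjust to the real one.
MODEL_ID = "rifkat/uztext-568Mb-Roberta-BPE"

# RoBERTa-style tokenizers typically use "<mask>" as the mask token.
fill_mask = pipeline("fill-mask", model=MODEL_ID)

# Cyrillic-script Uzbek sentence with one masked token (illustrative example text).
for prediction in fill_mask("Тошкент Ўзбекистоннинг <mask> шаҳридир."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

Each prediction is a dictionary containing the proposed token (`token_str`) and its probability (`score`), so the loop prints the model's top candidates for the masked position.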