# MLM pre-training

## Roberta TR Medium Wp 44k
By ctoraman · Large Language Model · Transformers · Other

A case-insensitive RoBERTa model for Turkish, pre-trained with the masked language modeling objective and suited to Turkish text-processing tasks. A minimal usage sketch follows.
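
The sketch below shows masked-token prediction with the Hugging Face `transformers` fill-mask pipeline. The model ID `ctoraman/RoBERTa-TR-medium-wp-44k` is an assumption inferred from the listing name; check the model card for the exact repository name.

```python
from transformers import pipeline

# Load a fill-mask pipeline; the model ID below is assumed, not confirmed.
fill_mask = pipeline(
    "fill-mask",
    model="ctoraman/RoBERTa-TR-medium-wp-44k",
)

# RoBERTa tokenizers use "<mask>" as the mask token.
# Turkish example: "The weather is very <mask> today."
for prediction in fill_mask("Bugün hava çok <mask>."):
    print(prediction["token_str"], prediction["score"])
```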
## Javabert
By CAUKiel · Apache-2.0 · Large Language Model · Transformers · Other

A BERT-like model pre-trained on Java source code, designed specifically for the Java programming language and supporting code fill-in-the-blank (fill-mask) prediction, as sketched below.
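
As a sketch of code fill-in-the-blank with this model, again via the fill-mask pipeline: the model ID `CAUKiel/JavaBERT` is assumed from the author name shown in the listing.

```python
from transformers import pipeline

# Load a fill-mask pipeline for Java code; the model ID is assumed.
fill_mask = pipeline("fill-mask", model="CAUKiel/JavaBERT")

# BERT-style models use "[MASK]" as the mask token.
code = "public [MASK] main(String[] args) { }"
for prediction in fill_mask(code):
    print(prediction["token_str"], prediction["score"])
```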