# MLM pre-training
## RoBERTa TR Medium WP 44k

A case-insensitive RoBERTa model for the Turkish language, pre-trained with the masked language modeling objective and suitable for Turkish text-processing tasks.

Author: ctoraman · Tags: Large Language Model, Transformers, Other
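
Because the checkpoint is trained for masked language modeling, it can be queried directly with the `fill-mask` pipeline from the `transformers` library. A minimal sketch, assuming the hub ID `ctoraman/RoBERTa-TR-medium-wp-44k` (verify the exact repository name on the hub) and an illustrative Turkish sentence:

```python
from transformers import pipeline

# Hub ID assumed from the listing name; confirm the exact repo before use.
fill = pipeline("fill-mask", model="ctoraman/RoBERTa-TR-medium-wp-44k")

# Read the mask token from the tokenizer instead of hard-coding it:
# RoBERTa checkpoints usually use <mask>, WordPiece tokenizers use [MASK].
mask = fill.tokenizer.mask_token

# "Ankara is the ___ city of Turkey."
for pred in fill(f"Ankara Türkiye'nin {mask} şehridir."):
    print(f"{pred['token_str']!r}  score={pred['score']:.3f}")
```

Reading `mask_token` off the tokenizer matters here because the listing name suggests a RoBERTa architecture paired with a WordPiece vocabulary (WP 44k), so the mask string cannot be safely guessed.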

## JavaBERT

A BERT-like model pre-trained on Java source code, designed specifically for the Java programming language and supporting fill-in-the-blank (masked token) prediction on code.

Author: CAUKiel · License: Apache-2.0 · Tags: Large Language Model, Transformers, Other
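
The same pipeline drives JavaBERT's code fill-in-the-blank use case: mask one token in a Java snippet and let the model rank completions. A minimal sketch, assuming the hub ID `CAUKiel/javaBERT`:

```python
from transformers import pipeline

# Hub ID assumed from the listing; confirm the exact repo before use.
fill = pipeline("fill-mask", model="CAUKiel/javaBERT")

# BERT-style tokenizers typically use [MASK]; read it from the tokenizer.
mask = fill.tokenizer.mask_token

# Mask the statement keyword in a small Java method ({{ }} escape literal braces).
code = f"public int add(int a, int b) {{ {mask} a + b; }}"
for pred in fill(code):
    print(f"{pred['token_str']!r}  score={pred['score']:.3f}")
```

For a snippet like this one would expect tokens such as `return` to rank near the top, which is the kind of completion the listing advertises.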