Roberta TR Medium Bpe 44k
Developed by ctoraman
A RoBERTa model pre-trained on Turkish with the masked language modeling (MLM) objective; the model is uncased (case-insensitive).
Release date: 3/9/2022
Model Overview
This is a medium-scale RoBERTa model pre-trained specifically for Turkish, suitable for fine-tuning on a range of downstream natural language processing tasks.
Model Features
Turkish language optimization
Specifically pre-trained and optimized for Turkish
Medium-scale architecture
Adopts a medium-scale architecture with 8 layers, 8 attention heads, and a hidden size of 512
BPE tokenization
Uses a byte-pair encoding (BPE) tokenizer with a vocabulary size of 44.5k
Case-insensitive
The model is uncased: uppercase and lowercase Turkish input are treated identically
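The BPE vocabulary mentioned above is learned by repeatedly merging the most frequent adjacent symbol pair in a corpus. The following toy sketch (not the model's actual tokenizer, and using an invented miniature corpus) illustrates the merge loop on character-split Turkish words:

```python
from collections import Counter

def most_frequent_pair(words):
    """Count adjacent symbol pairs across a frequency-weighted vocabulary."""
    pairs = Counter()
    for symbols, freq in words.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs.most_common(1)[0][0] if pairs else None

def merge_pair(words, pair):
    """Replace every occurrence of `pair` with its concatenation."""
    merged = {}
    for symbols, freq in words.items():
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == pair:
                out.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] = freq
    return merged

# Toy corpus: word (split into characters) -> frequency.
corpus = {tuple("evler"): 5, tuple("evde"): 3, tuple("evi"): 2}
merges = []
for _ in range(3):                      # learn 3 merges
    pair = most_frequent_pair(corpus)
    if pair is None:
        break
    merges.append(pair)
    corpus = merge_pair(corpus, pair)
print(merges)                           # first learned merge is ('e', 'v')
```

Here the shared stem "ev" is merged first because it appears in every word; a real 44.5k-entry vocabulary is learned the same way over a large Turkish corpus.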
Model Capabilities
Text understanding
Text classification
Sequence labeling
Language model fine-tuning
Use Cases
Natural language processing
Turkish text classification
Can be fine-tuned for tasks such as sentiment analysis and topic classification of Turkish texts
Turkish named entity recognition
Can be used to identify entities such as person names and locations in Turkish texts
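Before fine-tuning for the tasks above, the pre-trained MLM head can be exercised directly with a fill-mask pipeline. A minimal sketch, assuming the checkpoint is published on the Hugging Face Hub as `ctoraman/RoBERTa-TR-medium-bpe-44k` (adjust the id if it differs); the hypothetical `top_tokens` helper just ranks the pipeline's candidate dictionaries:

```python
MODEL_ID = "ctoraman/RoBERTa-TR-medium-bpe-44k"  # assumed Hub identifier

def top_tokens(predictions, k=3):
    """Keep the k highest-scoring fill-mask candidates as (token, score) pairs."""
    ranked = sorted(predictions, key=lambda p: p["score"], reverse=True)
    return [(p["token_str"].strip(), round(p["score"], 4)) for p in ranked[:k]]

if __name__ == "__main__":
    # Downloads the checkpoint on first use. The model is uncased,
    # so lowercase the input before feeding it to the pipeline.
    from transformers import pipeline
    fill = pipeline("fill-mask", model=MODEL_ID)
    preds = fill("bu model türkçe metinler için <mask> edildi.".lower())
    print(top_tokens(preds))
```

For classification or named entity recognition, the same checkpoint would instead be loaded via `AutoModelForSequenceClassification` or `AutoModelForTokenClassification` and fine-tuned on labeled Turkish data.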