
ModernBERT Base TR Uncased

Developed by artiwise-ai
A Turkish pre-trained model based on the ModernBERT architecture, supporting an 8192-token context length with strong performance across multiple domains
Downloads 159
Release Time: 3/16/2025

Model Overview

This is the Turkish adaptation of ModernBERT: built on answerdotai/ModernBERT-base and fine-tuned on the Turkish portion of the CulturaX corpus, it is optimized for Turkish text processing.
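The checkpoint can be loaded through the Hugging Face transformers library. The minimal sketch below is illustrative only: the repository id artiwise-ai/ModernBERT-base-tr-uncased is inferred from the developer and model name on this page rather than confirmed, and a recent transformers release with ModernBERT support is assumed.

```python
# Minimal loading sketch; the repository id below is an assumption based on
# the developer and model name listed on this page.
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "artiwise-ai/ModernBERT-base-tr-uncased"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)
model.eval()  # inference mode for the examples that follow
```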

Model Features

Extended Context Length
Supports an 8192-token context length, far exceeding the 512-token limit of traditional BERT models (see the sketch after this list)
Multi-domain Optimization
Strong performance across multiple domains, including Q&A, product reviews, and biomedical text
Modern Architecture
Built on the ModernBERT architecture, with improved pre-training and fine-tuning capabilities
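The extended context window can be exercised directly at encoding time. A minimal sketch, reusing the tokenizer and model loaded in the overview above and truncating at the full 8192-token window:

```python
import torch

# Encode a long Turkish document; ModernBERT accepts up to 8192 tokens,
# versus the 512-token cap of classic BERT checkpoints.
long_text = "..."  # placeholder for a long Turkish document
inputs = tokenizer(
    long_text,
    truncation=True,
    max_length=8192,  # full ModernBERT context window
    return_tensors="pt",
)

with torch.no_grad():
    outputs = model(**inputs)

# One masked-LM logit vector per input token.
print(outputs.logits.shape)
```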

Model Capabilities

Turkish Text Understanding
Masked Language Modeling (see the sketch after this list)
Multi-domain Text Processing
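For the masked language modeling capability listed above, the transformers fill-mask pipeline offers a quick probe. The repository id and the [MASK] token format are assumptions carried over from the loading sketch:

```python
from transformers import pipeline

# Fill-mask sketch; the repository id and [MASK] token format are assumptions.
fill_mask = pipeline(
    "fill-mask",
    model="artiwise-ai/ModernBERT-base-tr-uncased",
)

# "The capital of Turkey is [MASK]." in Turkish.
for prediction in fill_mask("Türkiye'nin başkenti [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

Since the model is uncased, the tokenizer is expected to lowercase the input, so the casing of the prompt should not affect the predictions.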

Use Cases

Q&A Systems
Turkish Q&A
Used for building Turkish Q&A systems
Achieved 74.5% accuracy on Q&A datasets (5% masking ratio)
Sentiment Analysis
Product Review Analysis
Analyzing Turkish product reviews
Achieved 62.67% accuracy on review datasets (5% masking ratio)
Biomedical Text Processing
Medical Literature Analysis
Processing Turkish biomedical texts
Achieved 58.11% accuracy on biomedical datasets (5% masking ratio; see the evaluation sketch after this list)
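The accuracy figures above appear to refer to masked-token prediction: roughly 5% of the tokens in each evaluation text are replaced with the mask token, and the model's top-1 predictions are compared against the original tokens. The exact protocol is not documented here, so the following is only an illustrative sketch under that assumption, reusing the tokenizer and model loaded earlier; masked_token_accuracy is a hypothetical helper, not part of the released model.

```python
import torch

def masked_token_accuracy(texts, tokenizer, model, mask_ratio=0.05, seed=0):
    """Top-1 accuracy of masked-token prediction at a given mask ratio.

    Illustrative only; the page does not specify the exact evaluation protocol.
    """
    rng = torch.Generator().manual_seed(seed)
    correct, total = 0, 0
    for text in texts:
        enc = tokenizer(text, truncation=True, max_length=8192, return_tensors="pt")
        input_ids = enc["input_ids"].clone()
        # Mark special tokens ([CLS], [SEP], ...) so they are never masked.
        special = torch.tensor(
            tokenizer.get_special_tokens_mask(
                input_ids[0].tolist(), already_has_special_tokens=True
            ),
            dtype=torch.bool,
        )
        candidates = (~special).nonzero(as_tuple=True)[0]
        if len(candidates) == 0:
            continue
        # Mask ~5% of the non-special tokens (at least one).
        n_mask = max(1, int(mask_ratio * len(candidates)))
        picked = candidates[torch.randperm(len(candidates), generator=rng)[:n_mask]]
        labels = input_ids[0, picked].clone()
        input_ids[0, picked] = tokenizer.mask_token_id

        with torch.no_grad():
            logits = model(input_ids=input_ids,
                           attention_mask=enc["attention_mask"]).logits
        preds = logits[0, picked].argmax(dim=-1)
        correct += (preds == labels).sum().item()
        total += n_mask
    return correct / total
```

Called with a list of Turkish evaluation texts from a given domain, this returns the fraction of masked tokens whose top-ranked prediction matches the original token, which is the kind of figure the percentages above describe.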