
VBART Medium Base

Developed by vngrs-ai
VBART is the first large-scale sequence-to-sequence language model pretrained from scratch on Turkish corpora, developed by VNGRS.
Release date: 3/22/2024

Model Overview

VBART is a Transformer encoder-decoder model based on the mBART architecture, specifically pretrained for Turkish. After fine-tuning, the model can perform conditional text generation tasks such as summarization, paraphrasing, and headline generation.
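As a sketch of how such a fine-tuned checkpoint might be used, the snippet below assumes the model is published on the Hugging Face Hub under `vngrs-ai/VBART-Medium-Base` and exposes a standard seq2seq interface; the Hub id and the `summarize` helper are illustrative assumptions, not confirmed by this page.

```python
# Hedged sketch: assumes a standard Hugging Face seq2seq interface for VBART.
# "vngrs-ai/VBART-Medium-Base" is an assumed Hub id; a base checkpoint would
# still need task fine-tuning before producing useful summaries.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_ID = "vngrs-ai/VBART-Medium-Base"  # assumed model id

def summarize(text: str, model, tokenizer, max_new_tokens: int = 64) -> str:
    """Conditional text generation: encode the Turkish input, decode a summary."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens, num_beams=4)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)
    article = "..."  # a long Turkish news article
    print(summarize(article, model, tokenizer))
```

Beam search (`num_beams=4`) is a common default for summarization-style conditional generation; greedy decoding or sampling would also work with the same call.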

Model Features

Turkish-specific model: the first sequence-to-sequence model pretrained from scratch on large-scale Turkish corpora.
Efficient performance: despite its smaller size, it outperforms its multilingual counterparts.
Large-scale pretraining: pretrained on 63 billion tokens from high-quality filtered Turkish datasets.

Model Capabilities

Text summarization
Text paraphrasing
Headline generation
Conditional text generation

Use Cases

Text processing
News summarization: automatically generate concise summaries of long news articles.
Content paraphrasing: rephrase existing text to produce versions with different wording.
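To illustrate the paraphrasing use case, the sketch below assumes a fine-tuned paraphrasing checkpoint is available on the Hub; the id `vngrs-ai/VBART-Medium-Paraphrasing` and the helper function are hypothetical names for illustration only.

```python
# Hedged sketch of the paraphrasing use case. The checkpoint id below is a
# hypothetical name; the base model must be fine-tuned before it can paraphrase.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

CHECKPOINT = "vngrs-ai/VBART-Medium-Paraphrasing"  # hypothetical checkpoint id

def paraphrase(sentence: str, model, tokenizer) -> str:
    """Generate a reworded version of a Turkish sentence via sampling."""
    inputs = tokenizer(sentence, return_tensors="pt", truncation=True)
    out = model.generate(**inputs, max_new_tokens=48, do_sample=True, top_p=0.95)
    return tokenizer.decode(out[0], skip_special_tokens=True)

if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
    model = AutoModelForSeq2SeqLM.from_pretrained(CHECKPOINT)
    print(paraphrase("Bugün hava çok güzel.", model, tokenizer))
```

Sampling (`do_sample=True`, `top_p=0.95`) is chosen here because paraphrasing benefits from varied wording, whereas summarization more often uses beam search.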