
VBART Small Base

Developed by vngrs-ai
VBART is the first large-scale sequence-to-sequence language model pretrained from scratch on Turkish corpora, developed by VNGRS.
Released: 3/22/2024

Model Overview

VBART is a Transformer encoder-decoder model based on the mBART architecture, primarily designed for Turkish text-to-text generation tasks such as summarization, rewriting, and headline generation.

Model Features

Turkish-specific model
The first sequence-to-sequence model pretrained from scratch on large-scale Turkish corpora.
Efficient performance
Despite its smaller size, it outperforms its multilingual counterparts on Turkish generation tasks.
Pretrained foundation model
Pretrained with a masked (denoising) language modeling objective, making it well suited for fine-tuning on downstream tasks.
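
The denoising pretraining used by BART-style models corrupts the input by masking a span of tokens and trains the decoder to reconstruct the original text. The sketch below is a minimal illustration of that text-infilling idea; VBART's actual masking ratio, span sampling, and special tokens are assumptions here, not taken from the model card.

```python
import random

def infill_mask(tokens, mask_ratio=0.3, mask_token="<mask>", rng=None):
    """Replace one contiguous span of tokens with a single mask token,
    mimicking the text-infilling objective of BART-style pretraining.
    (Illustrative only; VBART's exact scheme may differ.)"""
    rng = rng or random.Random(0)
    span_len = max(1, int(len(tokens) * mask_ratio))
    start = rng.randrange(0, len(tokens) - span_len + 1)
    corrupted = tokens[:start] + [mask_token] + tokens[start + span_len:]
    # Encoder sees the corrupted sequence; decoder learns to emit the original.
    return corrupted, tokens

src, tgt = infill_mask("Ankara Türkiye nin başkentidir".split())
```

During pretraining, many such (corrupted, original) pairs teach the encoder-decoder to fill in missing spans, which is the foundation later reused when fine-tuning for summarization or headline generation.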

Model Capabilities

Text summarization
Text rewriting
Headline generation
Masked language modeling

Use Cases

Text processing
Text summarization
Condense long texts into concise summaries.
Text rewriting
Paraphrase text or transform its style.
Headline generation
Generate appropriate headlines based on text content.
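
For the use cases above, a typical workflow is to load a checkpoint fine-tuned from this base model with Hugging Face transformers and generate text. The sketch below is an assumption-laden illustration: the checkpoint name, generation settings, and the pre-truncation helper are hypothetical, and the base model would first need fine-tuning on a summarization dataset.

```python
def summarize(text, model_name="vngrs-ai/VBART-Small-Base", max_new_tokens=64):
    """Hedged sketch: load a (hypothetically fine-tuned) VBART checkpoint
    and generate a summary. Checkpoint name and generation parameters are
    assumptions, not taken from the model card."""
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens,
                                num_beams=4)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

def truncate_words(text, max_words=256):
    """Crude pre-truncation for inputs longer than the encoder's limit
    (hypothetical helper; a tokenizer-aware cutoff would be more precise)."""
    return " ".join(text.split()[:max_words])
```

The same pattern applies to rewriting and headline generation; only the fine-tuning data and prompt text change.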