
BART Base Japanese

Developed by ku-nlp
A BART base model pretrained on Japanese Wikipedia, suitable for Japanese natural language processing tasks.
Release date: 5/9/2023

Model Overview

This model is a BART base model pretrained on Japanese Wikipedia. BART is a denoising sequence-to-sequence (encoder-decoder) Transformer, so after fine-tuning the model is suited to Japanese text generation tasks such as summarization and translation, as well as other natural language processing tasks.

Model Features

Japanese-specific pretraining
The model is pretrained exclusively on Japanese text (Japanese Wikipedia), so its vocabulary and learned representations are tailored to Japanese natural language processing tasks.
Juman++ tokenization
Input text must be segmented into words with Juman++ before it is passed to the tokenizer, which expects whitespace-separated words rather than raw text (see the sketch after this list).
Multi-GPU training
The model was trained for two weeks on 4 Tesla V100 GPUs using distributed training.
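A minimal usage sketch of the Juman++ preprocessing step, assuming the pyknp Python binding for Juman++ and the Hugging Face transformers AutoClasses; the model ID "ku-nlp/bart-base-japanese" is inferred from this card's developer and model name, not stated in the card itself.

```python
from pyknp import Juman
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Juman++ segments raw Japanese into words; the model's tokenizer
# expects this whitespace-separated form rather than raw text.
jumanpp = Juman()  # recent pyknp versions use the jumanpp backend by default
raw_text = "京都大学で自然言語処理を専攻する。"
segmented = " ".join(m.midasi for m in jumanpp.analysis(raw_text).mrph_list())

# Assumed model ID; adjust if the hosted checkpoint is named differently.
tokenizer = AutoTokenizer.from_pretrained("ku-nlp/bart-base-japanese")
model = AutoModelForSeq2SeqLM.from_pretrained("ku-nlp/bart-base-japanese")

inputs = tokenizer(segmented, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

The segmentation step is the part that differs from most BART checkpoints: skipping it feeds the tokenizer text in a form it was never trained on, which silently degrades output quality.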

Model Capabilities

Japanese text generation
Natural language processing
Text summarization
Machine translation

Use Cases

Natural language processing
Text summarization
Fine-tune the model to generate summaries of Japanese text (see the sketch after this list).
Machine translation
Fine-tune the model for translation between Japanese and other languages.
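As an illustration of the summarization use case, a minimal fine-tuning sketch assuming transformers >= 4.21 (for the text_target argument) and a hypothetical train_pairs list of (document, summary) strings already segmented by Juman++; none of this is prescribed by the model card.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("ku-nlp/bart-base-japanese")
model = AutoModelForSeq2SeqLM.from_pretrained("ku-nlp/bart-base-japanese")
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)

# Hypothetical training data: each pair is pre-segmented by Juman++.
train_pairs = [
    ("...", "..."),  # (segmented document, segmented reference summary)
]

model.train()
for document, summary in train_pairs:
    batch = tokenizer(document, text_target=summary, truncation=True,
                      max_length=512, return_tensors="pt")
    loss = model(**batch).loss  # seq2seq cross-entropy on the summary tokens
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

A translation fine-tune would follow the same pattern, with the target side of each pair in the other language.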