
BART Large Japanese

Developed by ku-nlp
A Japanese BART large model pre-trained on Japanese Wikipedia, suitable for text generation and natural language processing tasks.
Downloads: 206
Release date: 5/9/2023

Model Overview

This is a Japanese BART large model pre-trained on Japanese Wikipedia. BART is a denoising sequence-to-sequence Transformer, and this checkpoint is intended primarily for Japanese text generation and other natural language processing tasks.

Model Features

Japanese-specific pre-training
Pre-trained specifically on Japanese text, optimizing its Japanese-language processing capabilities.
Juman++-based tokenization
Input text must be pre-segmented into words with Juman++ before it is passed to the tokenizer, to ensure accurate processing (see the sketch after this list).
Large-scale training data
Pre-trained on Japanese Wikipedia (about 18 million sentences).
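
Because inputs must be segmented by Juman++ before tokenization, a typical pipeline first runs Juman++ and then passes the whitespace-joined words to the model tokenizer. Below is a minimal sketch of that flow; the Hub identifier "ku-nlp/bart-large-japanese", the standard transformers seq2seq API, and the pyknp binding for Juman++ are assumptions, not details stated on this page.

# A minimal sketch of the expected preprocessing and inference flow.
# Assumptions (not stated on this page): Juman++ and the pyknp binding
# are installed, and the checkpoint is published on the Hugging Face Hub
# as "ku-nlp/bart-large-japanese" with the standard seq2seq API.
from pyknp import Juman
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_ID = "ku-nlp/bart-large-japanese"  # assumed Hub identifier

# Step 1: segment raw Japanese text into words with Juman++.
jumanpp = Juman()
raw_text = "京都大学で自然言語処理を専攻する。"
segmented = " ".join(
    mrph.midasi for mrph in jumanpp.analysis(raw_text).mrph_list()
)

# Step 2: pass the whitespace-separated words to the model tokenizer.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)

inputs = tokenizer(segmented, return_tensors="pt")

# Note: without fine-tuning, the pre-trained model is a denoising
# autoencoder, so generate() will mostly reconstruct the input.
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))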

Model Capabilities

Japanese text generation
Natural language processing
Text summarization
Machine translation

Use Cases

Academic research
Natural language processing research
Used for research and experiments related to Japanese natural language processing.
Text processing
Text summarization
Generate summaries of Japanese text, typically after task-specific fine-tuning (see the sketch below).
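
The pre-trained checkpoint is a denoising model, so summarization normally requires task-specific fine-tuning first. Below is a hedged sketch of that fine-tuning step using the transformers Seq2SeqTrainer; the dataset name "my_org/japanese_summaries" and its "document"/"summary" columns are hypothetical placeholders, and the Hub identifier is the same assumption as above.

# A hedged fine-tuning sketch for Japanese summarization.
# Assumption: the hypothetical dataset "my_org/japanese_summaries" has
# Juman++-segmented "document" and "summary" columns; substitute your own.
from datasets import load_dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

MODEL_ID = "ku-nlp/bart-large-japanese"  # assumed Hub identifier

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)

dataset = load_dataset("my_org/japanese_summaries")  # hypothetical corpus

def preprocess(batch):
    # Tokenize source documents and target summaries separately.
    model_inputs = tokenizer(batch["document"], max_length=1024, truncation=True)
    labels = tokenizer(text_target=batch["summary"], max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset.map(preprocess, batched=True)

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(
        output_dir="bart-large-japanese-summary",
        per_device_train_batch_size=4,
        num_train_epochs=3,
        predict_with_generate=True,
    ),
    train_dataset=tokenized["train"],
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()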