
ELECTRA Small Paper Japanese Generator

Developed by izumi-lab
A small ELECTRA model pretrained on Japanese Wikipedia, suitable for Japanese text generation and infilling tasks.
Downloads 15
Release date: 3/2/2022

Model Overview

This is an ELECTRA model pretrained on Japanese text. It is used primarily for Japanese text generation and infilling tasks, and is especially suited to work that requires contextual understanding and generation of Japanese text.

Model Features

Japanese-specific pretraining
Pretrained specifically on Japanese text, using Japanese Wikipedia as the training corpus
Compact and efficient architecture
Uses a small ELECTRA architecture with low computational resource requirements, making it suitable for resource-constrained environments
Professional tokenization processing
Uses MeCab (with the IPA dictionary) for Japanese morphological tokenization, combined with the WordPiece algorithm for subword segmentation
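
The second stage of that pipeline, WordPiece, splits each MeCab token into subwords by greedy longest-match-first lookup against the model's vocabulary. The following is a minimal, self-contained sketch of that matching rule with a toy vocabulary; a real model's vocabulary comes from its tokenizer files, and library implementations handle many more details.

```python
def wordpiece(word, vocab, unk="[UNK]"):
    """Toy WordPiece: greedy longest-match-first subword segmentation.

    Non-initial pieces carry the '##' continuation prefix; if any part of
    the word cannot be matched, the whole word becomes the unknown token,
    mirroring standard WordPiece behavior.
    """
    pieces, start = [], 0
    while start < len(word):
        end, cur = len(word), None
        while start < end:
            sub = word[start:end]
            if start > 0:
                sub = "##" + sub  # mark continuation pieces
            if sub in vocab:
                cur = sub
                break
            end -= 1  # shrink the candidate from the right
        if cur is None:
            return [unk]
        pieces.append(cur)
        start = end
    return pieces

# Toy vocabulary for illustration only.
vocab = {"東京", "##大学", "##大", "##学"}
print(wordpiece("東京大学", vocab))  # → ['東京', '##大学']
```

Note that the longest match wins: `##大学` is preferred over the shorter `##大` followed by `##学`.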

Model Capabilities

Japanese text generation
Text infilling (MASK task)
Japanese text understanding

Use Cases

Academic research
Academic text infilling
Filling missing content in academic texts, such as 'Conducting research on [MASK] at the University of Tokyo'
Text generation
Automatic Japanese text generation
Generating coherent Japanese text paragraphs
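
The infilling use case above maps to the standard fill-mask workflow in the Hugging Face `transformers` library. Below is a hedged sketch: the model id `izumi-lab/electra-small-paper-japanese-generator` is assumed from this card, and running it requires `transformers`, a MeCab binding such as `fugashi` with `ipadic`, and network access to download the weights, so the call is wrapped to degrade gracefully when those are unavailable.

```python
# Sketch of fill-mask usage; requires transformers + fugashi + ipadic
# and network access to fetch the model weights.
masked_text = "東京大学で[MASK]の研究をしています。"
preds = []
try:
    from transformers import pipeline

    fill = pipeline(
        "fill-mask",
        model="izumi-lab/electra-small-paper-japanese-generator",
    )
    preds = fill(masked_text)
    for p in preds[:3]:
        # Each prediction carries the proposed token and its score.
        print(p["token_str"], round(p["score"], 4))
except Exception as exc:  # missing dependencies or offline environment
    print("fill-mask pipeline unavailable:", exc)
```

Each returned candidate is a dictionary with the filled token, its probability score, and the completed sentence, so downstream code can pick the top candidate or rerank the list.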
© 2025 AIbase