Bert Small Japanese
Developed by izumi-lab
A small BERT model pre-trained on Japanese Wikipedia, optimized for financial text mining
Downloads 358
Release Date: 3/2/2022
Model Overview
This BERT model is pre-trained on Japanese text and adopts the BERT-small architecture described in the ELECTRA paper. It is well suited to text-analysis tasks in the financial domain.
Model Features
Domain Optimization
Specifically optimized for financial text mining tasks
Efficient Architecture
Uses the BERT-small architecture to balance performance and computational efficiency
Professional Tokenization
Uses the MeCab tokenizer (with the IPA dictionary) for Japanese word segmentation
Model Capabilities
Japanese Text Understanding
Financial Text Analysis
Text Classification
Entity Recognition
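The capabilities above can be tried directly with a masked-language-model pipeline. The sketch below is a minimal example using Hugging Face Transformers; the hub id `izumi-lab/bert-small-japanese`, the example sentence, and the `fugashi`/`ipadic` dependencies for MeCab tokenization are assumptions not stated on this card.

```python
# Minimal sketch: masked-LM inference with this model via Hugging Face
# Transformers. Assumes the hub id below is correct and that the MeCab
# tokenizer dependencies are installed: pip install fugashi ipadic
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

MODEL_ID = "izumi-lab/bert-small-japanese"  # assumed hub id

def build_fill_mask():
    # The tokenizer performs MeCab (IPA dictionary) word segmentation
    # before WordPiece, matching the card's tokenization description.
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForMaskedLM.from_pretrained(MODEL_ID)
    return pipeline("fill-mask", model=model, tokenizer=tokenizer)

if __name__ == "__main__":
    fill_mask = build_fill_mask()
    # Example financial-style sentence: "Japan's [MASK] rose."
    for pred in fill_mask("日本の[MASK]は上昇した。"):
        print(pred["token_str"], round(pred["score"], 3))
```

For downstream tasks such as text classification or entity recognition, the same checkpoint would be loaded with `AutoModelForSequenceClassification` or `AutoModelForTokenClassification` and fine-tuned on labeled financial data.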
Use Cases
Financial Analysis
Financial News Analysis
Analyzes key information and trends in Japanese financial news
Financial Report Information Extraction
Extracts key financial data and indicators from corporate reports
Academic Research
Financial Text Mining
Supports natural language processing research in the financial domain