BERT Small Japanese Fin

Developed by izumi-lab
This is a BERT model pre-trained on Japanese text, specifically optimized for the financial domain.
Downloads 4,446
Release date: March 2, 2022

Model Overview

This model is a BERT model pre-trained on Japanese Wikipedia and financial domain corpora, suitable for financial text mining tasks.

Model Features

Domain-Specific Pre-training
Pre-trained using a combination of general Wikipedia corpus and financial domain-specific corpus
Efficient Architecture
Uses a small BERT architecture that balances accuracy and efficiency
Professional Tokenization
Uses the MeCab morphological analyzer with the IPA dictionary for tokenization
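
The features above can be seen directly when loading the model. Below is a minimal sketch using the Hugging Face `transformers` library, assuming the Hub id is `izumi-lab/bert-small-japanese-fin` and that the MeCab bindings (`fugashi` and the IPA dictionary package) are installed; the example sentence is illustrative.

```python
# Sketch: load the tokenizer and masked-LM head for the financial BERT model.
# Assumes the Hub id "izumi-lab/bert-small-japanese-fin" and that fugashi +
# the IPA dictionary are installed (required for MeCab-based tokenization).
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "izumi-lab/bert-small-japanese-fin"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# MeCab + WordPiece tokenization of a financial sentence
# ("Net sales for the period increased 10% year over year.")
tokens = tokenizer.tokenize("当期の売上高は前年比10%増加した。")
print(tokens)
```

The tokenizer first segments the sentence into morphemes with MeCab, then applies WordPiece, so financial terms present in the domain corpus tend to survive as whole subwords rather than being split into characters.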

Model Capabilities

Japanese Text Understanding
Financial Text Analysis
Text Feature Extraction
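
For the feature-extraction capability, a common pattern is to mean-pool the final hidden states into a fixed-size sentence vector. The sketch below assumes the same Hub id as above and PyTorch as the backend; the pooling strategy is one reasonable choice, not the model's prescribed usage.

```python
# Sketch: extract a sentence embedding from the financial BERT model.
# Assumes the Hub id "izumi-lab/bert-small-japanese-fin"; mean pooling over
# the last hidden state is an illustrative choice for sentence features.
import torch
from transformers import AutoTokenizer, AutoModel

model_id = "izumi-lab/bert-small-japanese-fin"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# "Analyze the securities report." (illustrative input)
inputs = tokenizer("有価証券報告書を分析する。", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Average token representations into one vector per sentence
embedding = outputs.last_hidden_state.mean(dim=1)
print(embedding.shape)  # (1, hidden_size)
```

The resulting vector can feed downstream financial text-mining tasks such as similarity search or classification over report passages.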

Use Cases

Financial Analysis
Financial Report Analysis
Analyze company financial report summaries
Securities Report Processing
Parse and understand securities report content