FinBERT Pretrain
FinBERT is a BERT model pretrained on financial communication texts, specifically designed for financial natural language processing tasks.
Release Time: 7/21/2022
Model Overview
Pretrained on a large corpus of financial-domain texts, FinBERT is intended to advance research and practice in financial natural language processing, and is particularly well suited to financial text analysis and comprehension tasks.
Model Features
Financial Domain Pretraining
Pretrained on financial communication texts (including annual reports, earnings call transcripts, and analyst reports), with a total corpus size of 4.9 billion tokens.
Efficient Training
Trained on NVIDIA DGX-1 servers using the Horovod framework for multi-GPU training; full pretraining of a single model takes only about two days.
Downstream Task Adaptation
Can be fine-tuned for specific financial NLP tasks, such as analyst sentiment classification.
Model Capabilities
Financial Text Understanding
Masked Token Prediction
Sentiment Analysis
Text Classification
Use Cases
Financial Analysis
Analyst Sentiment Classification
Used to analyze the sentiment of financial analyst reports.
Fine-tuned models are available on Hugging Face.
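A minimal sketch of how such a fine-tuned checkpoint could be called through the Hugging Face transformers pipeline. The model id `yiyanghkust/finbert-tone` and the label set (Positive/Neutral/Negative) are assumptions, not confirmed by this card; verify both on the Hub before use.

```python
# Sketch: analyst-tone classification with a fine-tuned FinBERT checkpoint.
# Assumptions: the Hub model id "yiyanghkust/finbert-tone" and the label set
# {"Positive", "Neutral", "Negative"} -- verify both on the Hugging Face Hub.
LABEL_SIGN = {"Positive": 1, "Neutral": 0, "Negative": -1}


def tone_sign(label: str) -> int:
    """Map a tone label to a signed value, e.g. for aggregating over a report."""
    return LABEL_SIGN[label]


def classify_tone(sentences, model_id: str = "yiyanghkust/finbert-tone"):
    """Classify each sentence's tone with a text-classification pipeline."""
    from transformers import pipeline  # imported lazily: triggers a model download
    clf = pipeline("text-classification", model=model_id)
    return [(s, r["label"], tone_sign(r["label"]))
            for s, r in zip(sentences, clf(sentences))]


if __name__ == "__main__":
    print(classify_tone([
        "Growth is strong and we have plenty of liquidity.",
        "There is a shortage of capital, and we need extra financing.",
    ]))
```

Mapping labels to signed values makes it easy to average sentiment over all sentences of a report.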
Financial Text Completion
Predicts missing words or phrases in financial texts.
It can accurately predict domain-specific terminology in financial text.
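The masked-prediction capability above can be sketched with the transformers fill-mask pipeline. The Hub id `yiyanghkust/finbert-pretrain` is an assumption inferred from the model name; confirm it on the Hub.

```python
# Sketch: masked-token prediction with the pretrained FinBERT model.
# Assumption: the Hub model id "yiyanghkust/finbert-pretrain" -- verify before use.
MASK = "[MASK]"  # BERT-style mask placeholder used by FinBERT's tokenizer


def mask_word(sentence: str, word: str) -> str:
    """Replace the first occurrence of `word` with the [MASK] placeholder."""
    return sentence.replace(word, MASK, 1)


def top_fills(sentence: str, model_id: str = "yiyanghkust/finbert-pretrain",
              k: int = 5):
    """Return the model's top-k candidate tokens for the masked position."""
    from transformers import pipeline  # imported lazily: triggers a model download
    fill = pipeline("fill-mask", model=model_id, top_k=k)
    return [r["token_str"] for r in fill(sentence)]


if __name__ == "__main__":
    s = mask_word("The company reported strong revenue growth this quarter.",
                  "revenue")
    print(s)  # "The company reported strong [MASK] growth this quarter."
    print(top_fills(s))
```

A financially pretrained model should rank domain terms such as "revenue" or "earnings" higher at the masked position than a general-purpose BERT would.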