
SEC-BERT Base

Developed by nlpaueb
SEC-BERT is a family of BERT models tailored to the financial domain, pretrained on 10-K annual reports filed with the U.S. Securities and Exchange Commission (SEC), and intended to support financial NLP research and fintech applications.
Downloads: 749
Released: 3/2/2022

Model Overview

SEC-BERT is a BERT model specifically optimized for the financial domain, primarily used for natural language understanding tasks in financial documents, such as financial report analysis and financial entity recognition.

Model Features

Financial Domain Optimization
Trained specifically on 260,773 10-K annual reports, providing better understanding of financial terminology and expressions.
Multiple Variant Models
Offers three variants: the base model (SEC-BERT), a numeric model (SEC-BERT-NUM) that replaces every number with a [NUM] pseudo-token, and a numeric-shape model (SEC-BERT-SHAPE) that replaces each number with a pseudo-token encoding its digit shape.
Financial Entity Recognition
Excels in financial numeric entity recognition tasks, such as percentages, monetary amounts, and years.
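The two numeric variants depend on a preprocessing step applied before tokenization. A minimal sketch of that step, assuming the documented convention that SEC-BERT-NUM maps every number to a single [NUM] pseudo-token while SEC-BERT-SHAPE maps each number to a pseudo-token encoding its shape (e.g. 53.2 → [XX.X]); the helper names and regex here are illustrative, not part of the released models:

```python
import re

# Matches integers and decimal/comma-separated numbers, e.g. "9.4", "1,250".
NUM_RE = re.compile(r"\d+(?:[.,]\d+)*")

def to_num_tokens(text: str) -> str:
    """Replace every number with the [NUM] pseudo-token (SEC-BERT-NUM style)."""
    return NUM_RE.sub("[NUM]", text)

def to_shape_tokens(text: str) -> str:
    """Replace every number with a shape pseudo-token (SEC-BERT-SHAPE style),
    e.g. '53.2' -> '[XX.X]'."""
    def shape(match: re.Match) -> str:
        return "[" + re.sub(r"\d", "X", match.group(0)) + "]"
    return NUM_RE.sub(shape, text)

sentence = "Total revenues increased 9.4% to $53.2 million in 2019."
print(to_num_tokens(sentence))
# Total revenues increased [NUM]% to $[NUM] million in [NUM].
print(to_shape_tokens(sentence))
# Total revenues increased [X.X]% to $[XX.X] million in [XXXX].
```

The shape variant preserves magnitude and format information (two digits before a decimal point, a four-digit year) that the flat [NUM] token discards, which is why it helps on numeric entity tasks.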

Model Capabilities

Financial Text Understanding
Financial Report Analysis
Financial Entity Recognition
Financial Numeric Processing

Use Cases

Financial Report Analysis
Sales Trend Analysis
Analyzing sales trend changes in annual reports.
Accurately predicts financial trend words such as 'growth' or 'decline'.
Financial Entity Recognition
Financial Numeric Identification
Identifying monetary amounts, percentages, and other numeric information in reports.
Higher accuracy in recognizing numbers and units compared to general BERT models.
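The trend-word use case above can be sketched as masked-token prediction with the Hugging Face `transformers` fill-mask pipeline. This assumes the public checkpoint `nlpaueb/sec-bert-base` and requires network access to download it; the example sentence is illustrative:

```python
from transformers import pipeline

# Load the base SEC-BERT checkpoint as a fill-mask pipeline
# (downloads the model from the Hugging Face Hub on first run).
fill_mask = pipeline("fill-mask", model="nlpaueb/sec-bert-base")

sentence = "Total sales [MASK] compared to the prior fiscal year."
for pred in fill_mask(sentence, top_k=3):
    # Each prediction carries the filled-in token and a confidence score.
    print(f"{pred['token_str']!r}  score={pred['score']:.3f}")
```

A domain-tuned model is expected to rank directional financial terms (e.g. "increased", "decreased") highly here, whereas a general-domain BERT spreads probability over generic verbs.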
© 2025 AIbase