
ChronoBERT v1-19991231

Developed by manelalab
ChronoBERT is a series of high-performance, temporally consistent large language models designed to eliminate look-ahead bias and training-data leakage while maintaining strong language-understanding capabilities in time-sensitive applications.
Released: February 28, 2025

Model Overview

The model is pre-trained on diverse, high-quality, open-source, and timestamped texts to ensure temporal consistency. It outperforms standard BERT on the GLUE benchmark, supporting more reliable economic and financial modeling.
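The core idea behind temporal consistency is simple: every training document carries a timestamp, and only documents dated strictly before a chosen cutoff are allowed into the training set. A minimal sketch of that filtering step, assuming a hypothetical corpus of dicts with a `timestamp` field (not an API of the actual model):

```python
from datetime import date

def temporally_consistent(corpus, cutoff):
    """Keep only documents timestamped strictly before the cutoff,
    so the training set contains no information from the 'future'."""
    return [doc for doc in corpus if doc["timestamp"] < cutoff]

# Hypothetical corpus for the 1999-12-31 checkpoint's cutoff
corpus = [
    {"timestamp": date(1998, 6, 1), "text": "pre-2000 article"},
    {"timestamp": date(2003, 6, 1), "text": "post-2000 article"},
]
clean = temporally_consistent(corpus, date(2000, 1, 1))
# Only the pre-2000 article survives the filter.
```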

Model Features

Temporal Consistency
Eliminates look-ahead bias and training data leakage to ensure the integrity of historical analysis.
High Performance
Outperforms standard BERT on the GLUE benchmark while maintaining strong language understanding capabilities.
Diverse Pre-training Data
Pre-trained on 460 billion tokens of diverse, high-quality, open-source text from before the year 2000.
Incremental Updates
Updated annually from 2000 to 2024, adding 65 billion tokens of timestamped text each year.
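Because the series is released as annual checkpoints, a backtest must pick the latest checkpoint whose data cutoff precedes the analysis date; otherwise look-ahead bias creeps back in. A minimal sketch of that selection logic, assuming the naming pattern `chrono-bert-v1-YYYYMMDD` seen on this card (the helper itself is hypothetical):

```python
from datetime import date

# Annual data cutoffs implied by this card: 1999-12-31 through 2024-12-31.
CUTOFFS = [date(year, 12, 31) for year in range(1999, 2025)]

def pick_checkpoint(analysis_date: date) -> str:
    """Return the latest checkpoint whose data cutoff strictly
    precedes the analysis date, preserving temporal consistency."""
    eligible = [c for c in CUTOFFS if c < analysis_date]
    if not eligible:
        raise ValueError("No checkpoint predates the analysis date")
    # Hypothetical naming pattern following chrono-bert-v1-19991231
    return f"chrono-bert-v1-{max(eligible):%Y%m%d}"
```

For example, a backtest dated mid-2005 would use the 2004-12-31 checkpoint, never a later one.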

Model Capabilities

Language Understanding
Financial Forecasting
Time-Sensitive Analysis

Use Cases

Financial Modeling
Stock Return Prediction
Evaluated on return-prediction tasks based on Dow Jones news data.
Achieves a Sharpe ratio of 4.80, outperforming BERT, FinBERT, and StoriesLM-v1-1963, and performing comparably to Llama 3.1 8B (4.90).
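For readers unfamiliar with the metric, the Sharpe ratio reported above is the mean excess return of a strategy divided by the standard deviation of those excess returns, conventionally annualized by the square root of the number of trading periods per year. A minimal sketch (this is the standard formula, not code from the evaluation itself; the 252 trading days and zero risk-free rate are assumptions):

```python
import statistics

def annualized_sharpe(returns, risk_free=0.0, periods=252):
    """Annualized Sharpe ratio: mean excess return over its
    standard deviation, scaled by sqrt(periods per year)."""
    excess = [r - risk_free for r in returns]
    return (statistics.mean(excess) / statistics.stdev(excess)) * periods ** 0.5
```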
Natural Language Processing
GLUE Benchmark
Evaluates language understanding capabilities.
chrono-bert-v1-19991231 and chrono-bert-v1-20241231 scored 84.71 and 85.54, respectively, outperforming BERT (84.52).