astroBERT

Developed by adsabs
A language model specifically designed for astronomy and astrophysics, developed by the NASA/ADS team, supporting masked token filling, named entity recognition, and text classification tasks.
Release date: 6/28/2022

Model Overview

astroBERT is a domain-specific language model based on the BERT architecture and pre-trained on astrophysics literature. It provides text embedding generation, masked token filling, and scientific literature classification.
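A minimal sketch of the masked-token-filling capability, assuming the Hugging Face Hub checkpoint id `adsabs/astroBERT` and the standard `transformers` fill-mask pipeline (the first run downloads the model weights):

```python
# Masked token filling with astroBERT via the Hugging Face fill-mask pipeline.
# Assumption: the checkpoint is published on the Hub as "adsabs/astroBERT".
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="adsabs/astroBERT",
    tokenizer="adsabs/astroBERT",
)

# BERT-style models use the [MASK] placeholder for the token to predict.
results = fill_mask("M31 is a barred spiral [MASK] in the Local Group.")
for r in results:
    print(f"{r['token_str']!r}  score={r['score']:.3f}")
```

The pipeline returns the top candidate tokens for the masked position ranked by probability; `top_k` can be passed to change how many candidates are returned.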

Model Features

Domain-Specific Optimization
Pre-trained and fine-tuned on astrophysics literature for better understanding of professional terminology and concepts.
Multi-Task Support
The base model supports masked token filling, while derived models can perform named entity recognition and scientific literature classification.
Case-Sensitive Processing
As a cased model, it distinguishes case differences in technical terms (e.g., 'ads' vs. 'ADS').

Model Capabilities

Text Embedding Generation
Masked Language Modeling
Scientific Literature Classification
Astronomy Entity Recognition
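For the embedding-generation capability, per-token hidden states are typically reduced to a single sentence vector by mean pooling over non-padding tokens. A self-contained sketch of that pooling step, using random stand-in values rather than real astroBERT outputs (with the actual model, the same pooling would be applied to `last_hidden_state`):

```python
# Illustrative mean pooling of token embeddings into one sentence embedding.
# The hidden states here are random placeholders, NOT actual model outputs.
import numpy as np

rng = np.random.default_rng(0)
seq_len, hidden = 8, 768  # astroBERT follows BERT-base's 768-dim hidden size
hidden_states = rng.normal(size=(seq_len, hidden))
attention_mask = np.array([1, 1, 1, 1, 1, 0, 0, 0])  # last 3 positions are padding

def mean_pool(states: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Average only the real (unmasked) token vectors, skipping padding."""
    mask = mask[:, None].astype(states.dtype)  # shape (seq_len, 1)
    return (states * mask).sum(axis=0) / mask.sum()

embedding = mean_pool(hidden_states, attention_mask)
print(embedding.shape)  # (768,)
```

Masking before averaging matters: including padding vectors would dilute the embedding for short inputs in a batch.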

Use Cases

Academic Research
Astrophysics Literature Analysis
Automatically extracts astrophysical entities and relationships from literature
Improves literature retrieval and knowledge mining efficiency
Scientific Literature Classification
Automatically classifies papers into subfields of astronomy
Supports classification into 7 scientific categories
Education
Astronomy Teaching Assistance
Generates explanations and examples of astronomical concepts