
ScholarBERT

Developed by globuslabs
A 340-million-parameter BERT-large variant pretrained on large-scale collections of scientific papers, specializing in scientific literature comprehension
Downloads: 25
Release date: 5/22/2022

Model Overview

ScholarBERT_100 is a masked language model pretrained on 221 billion tokens of scientific literature using the BERT-large architecture, making it well suited to scientific text processing tasks
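As a BERT-style encoder, the model is typically used through the Hugging Face transformers library. The minimal sketch below loads it for masked-token prediction; the Hub id globuslabs/ScholarBERT is an assumption based on the publishing organization, so adjust it to the checkpoint you actually use.

```python
from transformers import pipeline

# Hub id is an assumption; the ScholarBERT checkpoints are published
# under the globuslabs organization on the Hugging Face Hub.
fill_mask = pipeline("fill-mask", model="globuslabs/ScholarBERT")

# BERT-style models predict the token hidden behind [MASK].
for pred in fill_mask("The enzyme that unwinds DNA is called [MASK]."):
    print(f"{pred['token_str']!r}  score={pred['score']:.3f}")
```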

Model Features

Scientific Literature Optimization
Pretrained specifically on scientific literature spanning multidisciplinary fields, including the arts & humanities, life sciences, and physical sciences
Large-scale Training
Trained on an ultra-large-scale scientific literature dataset of 221 billion tokens
Case-sensitive
Preserves the original casing of the input text, which is particularly important for recognizing scientific terms (see the tokenizer sketch after this list)
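Because the checkpoint is cased, its tokenizer does not lowercase input, so surface forms such as "DNA" and "dna" are tokenized differently. A minimal check, again assuming the globuslabs/ScholarBERT Hub id:

```python
from transformers import AutoTokenizer

# Assumed Hub id; any cased ScholarBERT checkpoint behaves the same way.
tok = AutoTokenizer.from_pretrained("globuslabs/ScholarBERT")

# A cased vocabulary keeps acronyms and capitalization intact instead of
# folding everything to lowercase.
print(tok.tokenize("DNA polymerase"))
print(tok.tokenize("dna polymerase"))  # expect a different token split
```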

Model Capabilities

Scientific text understanding
Academic literature analysis
Multidisciplinary knowledge processing (see the embedding sketch after this list)
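One way to exercise these capabilities is to pool the encoder's hidden states into sentence embeddings and compare scientific passages by cosine similarity. A minimal sketch, assuming the globuslabs/ScholarBERT Hub id and mean pooling (a common choice, not one mandated by the model):

```python
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "globuslabs/ScholarBERT"  # assumed Hub id

tok = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)
model.eval()

sentences = [
    "CRISPR-Cas9 enables precise genome editing.",
    "Transformer models dominate natural language processing.",
]

with torch.no_grad():
    batch = tok(sentences, padding=True, truncation=True, return_tensors="pt")
    out = model(**batch)
    # Mean-pool token embeddings, masking out padding positions.
    mask = batch["attention_mask"].unsqueeze(-1)
    emb = (out.last_hidden_state * mask).sum(1) / mask.sum(1)

# Cosine similarity between the two sentence embeddings.
sim = torch.nn.functional.cosine_similarity(emb[0], emb[1], dim=0)
print(f"similarity: {sim.item():.3f}")
```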

Use Cases

Academic Research
Literature Review Generation
Automatically analyze large numbers of research papers and generate reviews of a field
Scientific Term Recognition
Accurately identify specialized terms and concepts in research literature (a fine-tuning sketch follows this section)
Educational Technology
Intelligent Academic Writing Assistance
Help students and researchers improve academic writing
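For a use case such as scientific term recognition, the usual recipe is to fine-tune the encoder for token classification. The sketch below only sets up the classification head; the BIO label set and the Hub id are illustrative assumptions, and training data would come from a domain NER corpus.

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer

# Illustrative BIO label set for term spans; adapt to your corpus.
labels = ["O", "B-TERM", "I-TERM"]

tok = AutoTokenizer.from_pretrained("globuslabs/ScholarBERT")  # assumed id
model = AutoModelForTokenClassification.from_pretrained(
    "globuslabs/ScholarBERT",
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)

# The randomly initialized classification head must be fine-tuned
# (e.g. with transformers.Trainer) before its predictions are meaningful.
```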