SciBERT SciVocab Uncased SQuAD V2
A BERT-based pre-trained language model for the scientific domain, trained using a scientific literature vocabulary
Release Time: 3/2/2022
Model Overview
SciBERT is a BERT model pre-trained specifically on scientific literature using a scientific-domain vocabulary (SciVocab), making it well suited to natural language processing tasks on scientific text. This checkpoint is further fine-tuned on SQuAD V2 for extractive question answering.
Model Features
Scientific Domain Optimization
Trained with a specialized scientific literature vocabulary (SciVocab), delivering better performance for scientific text processing
SQuAD V2 Fine-tuning
Fine-tuned on the SQuAD V2 question-answering dataset, which includes unanswerable questions, so the model can also judge whether a passage contains an answer at all
Efficient Training
Trained with mixed precision (FP16), enabling efficient fine-tuning on 4 RTX 2080 Ti GPUs
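A scientific-domain vocabulary such as SciVocab matters because WordPiece tokenization fragments out-of-vocabulary terms into many subwords, while a vocabulary built from scientific text keeps common scientific terms whole. Below is a minimal sketch of greedy longest-match-first WordPiece splitting; both vocabularies are tiny toy examples, not the real SciVocab or BERT vocabulary.

```python
# Toy WordPiece-style tokenizer: greedy longest-match-first subword split,
# as used by BERT. The vocabularies here are illustrative only.

def wordpiece(word, vocab, unk="[UNK]"):
    """Split a word into subword pieces from vocab; '##' marks continuations."""
    pieces, start = [], 0
    while start < len(word):
        end = len(word)
        piece = None
        while start < end:
            sub = word[start:end]
            if start > 0:
                sub = "##" + sub  # continuation pieces carry the '##' prefix
            if sub in vocab:
                piece = sub
                break
            end -= 1  # shrink the candidate until it is in the vocabulary
        if piece is None:
            return [unk]  # no piece matched at this position
        pieces.append(piece)
        start = end
    return pieces

general_vocab = {"photo", "##syn", "##thesis"}   # general-domain style vocab
science_vocab = {"photosynthesis"}               # domain vocab keeps the term whole

print(wordpiece("photosynthesis", general_vocab))  # ['photo', '##syn', '##thesis']
print(wordpiece("photosynthesis", science_vocab))  # ['photosynthesis']
```

Fewer fragments per scientific term means shorter input sequences and subword embeddings that correspond to meaningful units, which is the intuition behind SciVocab's benefit on scientific text.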
Model Capabilities
Scientific Text Understanding
Question Answering
No-answer Detection
Text Span Prediction
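The span-prediction and no-answer capabilities above follow the standard SQuAD V2 decoding scheme: the model scores each candidate answer span by its start and end logits and compares the best span against the "null" score taken at the [CLS] token. A minimal sketch with toy logits; the function and threshold names are illustrative, not this model's actual API.

```python
# SQuAD V2-style answer decoding, assuming per-token start/end logits have
# already been produced for one passage. Index 0 plays the role of the [CLS]
# token, whose combined logit serves as the "no answer" score.

def decode_answer(start_logits, end_logits, null_threshold=0.0, max_len=15):
    """Return (start, end) token indices of the best span, or None when the
    null (no-answer) score beats the best span by more than null_threshold."""
    null_score = start_logits[0] + end_logits[0]  # [CLS] start + [CLS] end
    best_span, best_score = None, float("-inf")
    for s in range(1, len(start_logits)):
        for e in range(s, min(s + max_len, len(end_logits))):
            score = start_logits[s] + end_logits[e]
            if score > best_score:
                best_span, best_score = (s, e), score
    # SQuAD V2 rule: predict "no answer" when the null score wins.
    if null_score - best_score > null_threshold:
        return None
    return best_span

# Toy logits: the span (2, 3) clearly dominates the null score.
starts = [0.1, -2.0, 5.0, -1.0, -3.0]
ends   = [0.2, -1.0, -2.0, 4.0, -3.0]
print(decode_answer(starts, ends))                           # answerable case
print(decode_answer([9.0, -5.0, -5.0], [9.0, -5.0, -5.0]))   # null score wins
```

In practice the `null_threshold` is tuned on the development set to trade off answer recall against false answers on unanswerable questions.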
Use Cases
Academic Research
Scientific Literature QA System
Automatically extracts answers to questions from scientific papers
Achieved an exact match score of 75.08 on the SQuAD V2 development set
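The exact match (EM) figure cited above counts a prediction as correct only if, after light normalization, it equals a reference answer exactly. A simplified sketch of the normalization used by the official SQuAD evaluation script (lowercasing, stripping punctuation and English articles, collapsing whitespace); the data here is invented for illustration.

```python
import re
import string

def normalize(text):
    """Lowercase, drop punctuation and English articles, collapse whitespace,
    mirroring the official SQuAD evaluation script's answer normalization."""
    text = text.lower()
    text = "".join(ch for ch in text if ch not in string.punctuation)
    text = re.sub(r"\b(a|an|the)\b", " ", text)
    return " ".join(text.split())

def exact_match(prediction, references):
    """EM is 1 if the normalized prediction equals any normalized reference."""
    return int(any(normalize(prediction) == normalize(r) for r in references))

# Toy predictions: surface differences like articles and punctuation are
# ignored, but a genuinely wrong answer still scores 0.
preds = [("The BERT model.", ["BERT model"]), ("in 2018", ["2019"])]
em = 100.0 * sum(exact_match(p, refs) for p, refs in preds) / len(preds)
print(em)  # 50.0
```

Averaged over the whole SQuAD V2 development set, this is the computation behind the 75.08 EM score reported above.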
Research Assistant
Helps researchers quickly locate specific information in literature
Educational Technology
Intelligent Learning System
Provides students with automated question-answering functionality based on scientific textbooks