
RadBERT-RoBERTa-4m

Developed by zzxslp
RadBERT-RoBERTa-4m is a medical-domain language model based on the RoBERTa architecture. It was trained on 4 million de-identified medical reports and performs strongly on medical language understanding tasks.
Downloads: 9,982
Release Time: 10/18/2022

Model Overview

This model is designed specifically for radiology reports and can perform tasks such as abnormal sentence classification, report coding, and report summarization, delivering strong performance in medical natural language processing.
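The checkpoint can be used as a drop-in RoBERTa encoder. Below is a minimal sketch of loading it with the Hugging Face transformers library and extracting a sentence embedding from a report sentence; the model ID zzxslp/RadBERT-RoBERTa-4m and the mean-pooling step are illustrative assumptions, not instructions from the model authors.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Assumed Hugging Face repo id, based on the "Developed by zzxslp" listing above.
MODEL_ID = "zzxslp/RadBERT-RoBERTa-4m"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)
model.eval()

sentence = "There is a small left pleural effusion with adjacent atelectasis."
inputs = tokenizer(sentence, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the final hidden states into a single sentence vector.
embedding = outputs.last_hidden_state.mean(dim=1)  # shape: (1, hidden_size)
print(embedding.shape)
```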

Model Features

Large-scale medical training data
Trained on 4 million de-identified medical reports, giving the model broad medical knowledge
Domain-specific optimization
Optimized specifically for radiology reports, outperforming general-domain models in medical language understanding
Multi-task support
Supports multiple radiology report processing tasks, such as anomaly detection, coding, and summarization

Model Capabilities

Medical text understanding
Anomaly detection
Diagnostic coding
Report summarization
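As a sketch of the anomaly detection capability listed above, the example below fine-tunes the encoder for binary abnormal-sentence classification with a freshly initialized classification head. The two sentences, labels, and hyperparameters are hypothetical placeholders, not the setup used in the RadBERT paper.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_ID = "zzxslp/RadBERT-RoBERTa-4m"  # assumed Hugging Face repo id

# Hypothetical toy data: 0 = normal sentence, 1 = abnormal sentence.
sentences = [
    "The lungs are clear without focal consolidation.",
    "There is a 2 cm nodule in the right upper lobe.",
]
labels = torch.tensor([0, 1])

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
# The classification head is newly initialized; real use requires labeled training data.
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID, num_labels=2)

enc = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
loader = DataLoader(TensorDataset(enc["input_ids"], enc["attention_mask"], labels),
                    batch_size=2, shuffle=True)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for input_ids, attention_mask, y in loader:  # one toy pass over the data
    out = model(input_ids=input_ids, attention_mask=attention_mask, labels=y)
    out.loss.backward()
    optimizer.step()
    optimizer.zero_grad()

print("toy training step done, loss:", out.loss.item())
```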

Use Cases

Medical report processing
Anomaly detection in radiology reports
Automatically identifies abnormal findings in radiology reports
The paper reports excellent performance on the abnormal sentence classification task (see the fine-tuning sketch above)
Diagnostic coding automation
Automatically assigns diagnostic codes to radiology reports
Supports five different coding systems
Report summarization
Extracts key sentences from the findings section of a report (a minimal sketch follows below)
Effectively summarizes report findings
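The extractive idea above can be approximated with the encoder alone. The following sketch ranks findings sentences by cosine similarity to an embedding of the whole section and keeps the top two; this is a simple embedding-based heuristic for illustration, not necessarily the summarization method used in the RadBERT paper, and the example findings are invented.

```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

MODEL_ID = "zzxslp/RadBERT-RoBERTa-4m"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)
model.eval()

# Hypothetical findings section, already split into sentences.
findings = [
    "Heart size is normal.",
    "There is a new 3 cm mass in the left lower lobe.",
    "No pleural effusion or pneumothorax.",
    "Degenerative changes of the thoracic spine.",
]

def embed(texts):
    """Mean-pooled sentence embeddings from the encoder's last hidden layer."""
    enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = model(**enc)
    return out.last_hidden_state.mean(dim=1)

sentence_vecs = embed(findings)            # (num_sentences, hidden_size)
section_vec = embed([" ".join(findings)])  # (1, hidden_size)

# Keep the two sentences most similar to the section as a whole.
scores = F.cosine_similarity(sentence_vecs, section_vec)
top = sorted(scores.topk(2).indices.tolist())
print([findings[i] for i in top])
```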