
RNABERT

Developed by multimolecule
RNABERT is a model pre-trained on non-coding RNA (ncRNA) sequences using Masked Language Modeling (MLM) and Structural Alignment Learning (SAL) objectives.
Downloads: 8,166
Release Time: 9/10/2024

Model Overview

RNABERT is a BERT-style model pre-trained in a self-supervised manner on a large corpus of non-coding RNA sequences. It is primarily used for RNA sequence feature extraction and structural alignment.

Model Features

Dual-objective pre-training
Jointly employs Masked Language Modeling (MLM) and Structural Alignment Learning (SAL) pre-training objectives
RNA-specific model
Specifically designed and trained for non-coding RNA sequences
Lightweight architecture
Only 0.48M parameters, keeping the model lightweight for RNA sequence processing tasks (a quick parameter check follows this list)
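
One way to sanity-check the lightweight-architecture claim is to load the model and sum its trainable parameters. This is a minimal sketch that assumes the multimolecule Python package exposes an RnaBertModel class with a Hugging Face-style from_pretrained interface and that the checkpoint is published as "multimolecule/rnabert"; neither detail is stated in this card.

```python
# Minimal parameter-count check. Assumes the multimolecule package
# provides RnaBertModel and a "multimolecule/rnabert" checkpoint;
# both are assumptions, not stated in this card.
from multimolecule import RnaBertModel

model = RnaBertModel.from_pretrained("multimolecule/rnabert")

# Sum all trainable parameters and report the total in millions.
num_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"Trainable parameters: {num_params / 1e6:.2f}M")
```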

Model Capabilities

RNA sequence feature extraction (a usage sketch follows this list)
RNA structural alignment prediction
RNA sequence masked prediction
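
The sketch below shows how sequence-level features might be extracted with the model. It assumes the multimolecule package provides RnaTokenizer and RnaBertModel with Hugging Face-compatible interfaces and a "multimolecule/rnabert" checkpoint; the example RNA sequence is purely illustrative.

```python
# Feature-extraction sketch. RnaTokenizer, RnaBertModel, and the
# checkpoint name are assumptions about the multimolecule package.
import torch
from multimolecule import RnaTokenizer, RnaBertModel

tokenizer = RnaTokenizer.from_pretrained("multimolecule/rnabert")
model = RnaBertModel.from_pretrained("multimolecule/rnabert")
model.eval()

sequence = "GGAUCUCGUACGCGAUCC"  # illustrative ncRNA fragment
inputs = tokenizer(sequence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Per-token embeddings: shape (batch, seq_len, hidden_size).
token_embeddings = outputs.last_hidden_state
# Mean-pool over tokens to get one feature vector per sequence.
sequence_embedding = token_embeddings.mean(dim=1)
print(sequence_embedding.shape)
```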

Use Cases

Bioinformatics
RNA functional clustering
Performs functional clustering analysis on RNA sequence features extracted by the model (a clustering sketch follows this section)
RNA structural alignment
Predicts structural alignment relationships between two RNA sequences
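
As a sketch of the functional-clustering use case, one could embed a handful of ncRNA sequences with RNABERT and group the mean-pooled embeddings with k-means. The multimolecule classes, checkpoint name, example sequences, and cluster count below are all illustrative assumptions, not details from this card.

```python
# Hypothetical clustering workflow: RNABERT embeddings + k-means.
import torch
from sklearn.cluster import KMeans
from multimolecule import RnaTokenizer, RnaBertModel

tokenizer = RnaTokenizer.from_pretrained("multimolecule/rnabert")
model = RnaBertModel.from_pretrained("multimolecule/rnabert")
model.eval()

sequences = [  # illustrative ncRNA fragments
    "GGAUCUCGUACGCGAUCC",
    "AUGGCUACGUUAGCCAUA",
    "CCGAUAGCUAGGCUAUCG",
    "UUAGCCGAUCGAUCGGAU",
]

embeddings = []
with torch.no_grad():
    for seq in sequences:
        inputs = tokenizer(seq, return_tensors="pt")
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, hidden)
        embeddings.append(hidden.mean(dim=1).squeeze(0))
features = torch.stack(embeddings).numpy()

# Group sequences into putative functional clusters.
labels = KMeans(n_clusters=2).fit_predict(features)
print(labels)
```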