RNAErnie

Developed by multimolecule
RNAErnie is a self-supervised pre-training model for non-coding RNA sequences. It uses a multi-stage masked language modeling objective to learn powerful feature representations for RNA research.
Downloads: 11.00k
Release date: 5/16/2025

Model Overview

RNAErnie is a BERT-based model designed specifically for non-coding RNA sequences. It learns feature representations of RNA sequences through self-supervised pre-training and supports a variety of RNA-related downstream tasks.

Model Features

Self-supervised pre-training
Pre-trained self-supervised on a large corpus of non-coding RNA sequences, learning powerful feature representations.
Multi-stage masking strategy
Uses a three-stage training pipeline of base-level, subsequence-level, and motif-level masking to improve the model's generalization.
Wide applicability
Can be used for a variety of downstream tasks, such as sequence classification, token classification, and contact prediction.
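The three-stage masking pipeline can be illustrated with a minimal sketch. This is not the actual RNAErnie implementation: the masking rate, span length, and example motif are illustrative assumptions, and the real model applies these stages to token IDs inside its training loop.

```python
import random

def mask_sequence(seq, spans, mask_token="<mask>"):
    """Replace the given (start, end) spans of an RNA sequence with mask
    tokens; return the masked token list and the original targets."""
    tokens = list(seq)
    targets = {}
    for start, end in spans:
        for i in range(start, end):
            targets[i] = tokens[i]
            tokens[i] = mask_token
    return tokens, targets

def base_level_spans(seq, rate=0.15, rng=random):
    """Stage 1: mask individual nucleotides at random."""
    return [(i, i + 1) for i in range(len(seq)) if rng.random() < rate]

def subsequence_level_spans(seq, span_len=4, n_spans=2, rng=random):
    """Stage 2: mask random contiguous subsequences (length is illustrative)."""
    starts = rng.sample(range(len(seq) - span_len + 1), n_spans)
    return [(s, s + span_len) for s in starts]

def motif_level_spans(seq, motifs):
    """Stage 3: mask every occurrence of known motifs."""
    spans = []
    for motif in motifs:
        start = seq.find(motif)
        while start != -1:
            spans.append((start, start + len(motif)))
            start = seq.find(motif, start + 1)
    return spans

seq = "AUGGCUACGUAGCUAGGAUCCGAUGC"
tokens, targets = mask_sequence(seq, motif_level_spans(seq, ["GGAUCC"]))
print("".join("_" if t == "<mask>" else t for t in tokens))
# masks the GGAUCC motif: AUGGCUACGUAGCUA______GAUGC
```

Masking progressively larger units (single bases, then spans, then biologically meaningful motifs) forces the model to use increasingly long-range context to recover the hidden tokens.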

Model Capabilities

RNA sequence feature extraction
RNA sequence masking filling
RNA sequence classification
RNA token classification
RNA contact prediction
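As an illustration of how a contact-prediction head can sit on top of the per-nucleotide features the model extracts, here is a generic sketch, not RNAErnie's actual head: the random embeddings and bilinear weight matrix are stand-ins for model outputs and learned parameters.

```python
import numpy as np

def contact_map(hidden, weight=None):
    """Score every nucleotide pair from per-residue embeddings with a
    symmetrized bilinear form; `hidden` is an (L, d) feature matrix."""
    L, d = hidden.shape
    if weight is None:  # stand-in for a learned (d, d) weight matrix
        weight = np.random.default_rng(0).standard_normal((d, d))
    logits = hidden @ weight @ hidden.T   # (L, L) pairwise scores
    logits = (logits + logits.T) / 2      # contacts are symmetric
    return 1.0 / (1.0 + np.exp(-logits))  # probabilities in (0, 1)

hidden = np.random.default_rng(1).standard_normal((8, 16))  # fake features
probs = contact_map(hidden)
```

Symmetrizing the logits encodes the fact that "i contacts j" and "j contacts i" are the same event, which is why contact heads typically average the score matrix with its transpose.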

Use Cases

Bioinformatics
RNA function prediction
Use the sequence features extracted by the model to predict an RNA's biological function.
RNA structure prediction
Predict the secondary or tertiary structure of RNA sequences with the model.
Medical research
Identification of disease-related RNA markers
Identify RNA sequence patterns or markers associated with specific diseases.