# Bioinformatics
## GoBERT

**Author:** MM-YY-WW · **License:** MIT · **Tags:** Protein Model · **Library:** Safetensors · **Downloads:** 479 · **Likes:** 1

GoBERT is a model designed for general gene function prediction. It captures relationships between Gene Ontology (GO) functions by leveraging GO graph information.
## RNABERT

**Author:** multimolecule · **Tags:** Molecular Model, Other · **Downloads:** 8,166 · **Likes:** 4

RNABERT is a model pre-trained on non-coding RNA (ncRNA) with masked language modeling (MLM) and structural alignment learning (SAL) objectives.
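Both RNABERT and RiNALMo (next entry) are distributed through the `multimolecule` package. A minimal embedding-extraction sketch, assuming the package exposes `RnaTokenizer` and `RnaBertModel` as its model cards describe:

```python
# Sketch: per-nucleotide embeddings from RNABERT via the multimolecule package.
# Assumes `pip install multimolecule` and the RnaTokenizer/RnaBertModel classes
# documented on the model card.
import torch
from multimolecule import RnaTokenizer, RnaBertModel

tokenizer = RnaTokenizer.from_pretrained("multimolecule/rnabert")
model = RnaBertModel.from_pretrained("multimolecule/rnabert")

inputs = tokenizer("AUGGCCAUUGGCGC", return_tensors="pt")
with torch.no_grad():
    out = model(**inputs)
print(out.last_hidden_state.shape)  # (1, tokens incl. specials, hidden_size)
```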
## RiNALMo

**Author:** multimolecule · **Tags:** Protein Model, Other · **Downloads:** 21.38k · **Likes:** 2

RiNALMo is a non-coding RNA (ncRNA) language model pre-trained with the masked language modeling (MLM) objective, trained by self-supervised learning on a large corpus of ncRNA sequences.
## BiRNA-BERT

**Author:** buetnlpbio · **Tags:** Text Embedding · **Library:** Transformers · **Downloads:** 364 · **Likes:** 1

A Transformer encoder based on the BERT architecture, designed for generating RNA sequence embeddings.
## BERT Protein Classifier

**Author:** oohtmeel · **Tags:** Protein Model · **Library:** Transformers · **Downloads:** 1,772 · **Likes:** 1

Fine-tuned from BERT-Base-Uncased for multi-label classification: predicting protein functions from amino acid sequences.
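Multi-label prediction scores each function independently with a sigmoid rather than a softmax over classes. A minimal sketch of that readout; the repository id and the 0.5 decision threshold below are placeholders, not documented values:

```python
# Sketch: multi-label protein function inference with a fine-tuned BERT.
# The repo id is a hypothetical placeholder; check the author's page for the
# actual checkpoint name. Sigmoid + threshold is the generic multi-label readout.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "oohtmeel/bert-protein-classifier"  # hypothetical id
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

inputs = tokenizer("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
probs = torch.sigmoid(logits)[0]            # independent per-label probabilities
labels = (probs > 0.5).nonzero().flatten()  # assumed threshold
print([model.config.id2label[i.item()] for i in labels])
```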
## ProGen2 Small

**Author:** hugohrban · **License:** BSD-3-Clause · **Tags:** Large Language Model · **Library:** Transformers · **Downloads:** 6,505 · **Likes:** 2

ProGen2-small is a mirror of the protein generation model from Nijkamp et al., with slight adjustments to the configuration and the forward pass.
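As a causal language model, ProGen2 generates protein sequences left to right. A sketch under two assumptions worth verifying against the model card: that the mirror loads through `AutoTokenizer`/`AutoModelForCausalLM` with `trust_remote_code=True`, and that prompts start with the `1` N-terminus token as in the ProGen2 paper:

```python
# Sketch: sample a protein continuation from ProGen2-small.
# trust_remote_code=True, the AutoTokenizer loading path, and the leading "1"
# prompt token are assumptions; the model card may use a lower-level tokenizer.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

repo = "hugohrban/progen2-small"
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo, trust_remote_code=True)

inputs = tokenizer("1MEVVIVTGMSGAGK", return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.95, temperature=0.8)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```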
## Protein Matryoshka Embeddings

**Author:** monsoon-nlp · **License:** CC · **Tags:** Protein Model · **Library:** Transformers · **Downloads:** 2,121 · **Likes:** 7

Generates embedding vectors for protein sequences; the embeddings can be shortened (Matryoshka-style) to accelerate search tasks.
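Matryoshka training concentrates the most useful signal in the leading dimensions, so a vector can be cut to a prefix and renormalized with modest accuracy loss. A sketch of that truncation step; loading the checkpoint via sentence-transformers is an assumption, so consult the model card for the exact recipe:

```python
# Sketch: Matryoshka-style truncation of protein embeddings.
# The sentence-transformers loading path is an assumption; the
# truncate-then-renormalize step is the generic Matryoshka technique.
import torch.nn.functional as F
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("monsoon-nlp/protein-matryoshka-embeddings")
full = model.encode(
    ["MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", "MSILVTRPSPAGEELVSRLRQLGQVA"],
    convert_to_tensor=True,
)

short = F.normalize(full[:, :128], dim=-1)  # keep the first 128 dims, renormalize
print(full.shape, short.shape)
```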
## ESM2 T6 8M UR50D Sequence Classifier V1

**Author:** Xenova · **Tags:** Text Classification · **Library:** Transformers · **Downloads:** 25 · **Likes:** 0

A sequence classifier built on the ESM-2 protein language model, designed for zero-shot classification of protein sequences.
## Nucleotide Transformer V2 50M Multi-Species

**Author:** InstaDeepAI · **Tags:** Molecular Model · **Library:** Transformers · **Downloads:** 18.72k · **Likes:** 3

The Nucleotide Transformer is a family of foundation language models pre-trained on whole-genome DNA sequences, integrating genomic data from over 3,200 human genomes and 850 diverse species.
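A sketch of turning a DNA sequence into a fixed-size embedding by mean-pooling the final hidden layer; `trust_remote_code=True` for the v2 checkpoints and the 6-mer tokenization note are assumptions drawn from the model family's documentation:

```python
# Sketch: DNA sequence embedding from Nucleotide Transformer v2.
# trust_remote_code=True is assumed for the v2 ports; check the model card.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

repo = "InstaDeepAI/nucleotide-transformer-v2-50m-multi-species"
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModelForMaskedLM.from_pretrained(repo, trust_remote_code=True)

# The family tokenizes DNA as 6-mers, so lengths divisible by 6 tokenize cleanly.
inputs = tokenizer("ATTCCGATTCCGATTCCG", return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs, output_hidden_states=True).hidden_states[-1]
embedding = hidden.mean(dim=1)  # simple mean pool (single sequence, no padding)
print(embedding.shape)
```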
## ESM-1b T33 650M UR50S

**Author:** facebook · **License:** MIT · **Tags:** Protein Model · **Library:** Transformers · **Downloads:** 24.20k · **Likes:** 18

ESM-1b is a Transformer-based protein language model trained by unsupervised learning on protein sequence data; its representations can be used to predict protein structure and function.
## ESM-2 T33 650M UR50D

**Author:** facebook · **License:** MIT · **Tags:** Protein Model · **Library:** Transformers · **Downloads:** 640.23k · **Likes:** 41

ESM-2 is a state-of-the-art protein language model trained with a masked language modeling objective, suited to protein sequence analysis and prediction tasks.
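Because ESM-2 is trained with masked language modeling, it can fill in a hidden residue directly. A minimal sketch using the standard transformers `fill-mask` pipeline with the `facebook/esm2_t33_650M_UR50D` checkpoint:

```python
# Sketch: predict a masked amino acid with ESM-2 via the fill-mask pipeline.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="facebook/esm2_t33_650M_UR50D")

# One residue is replaced with the mask token; print the top candidates.
for pred in unmasker("MKTAYIAKQRQISFVKSHFSRQ<mask>EERLGLIEVQ", top_k=3):
    print(pred["token_str"], round(pred["score"], 3))
```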
## TCR-BERT MLM Only

**Author:** wukevin · **Tags:** Protein Model · **Library:** Transformers · **Downloads:** 27 · **Likes:** 4

TCR-BERT is a BERT-based pre-trained model optimized for T-cell receptor (TCR) sequences through a masked amino acid modeling task.
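A sketch of extracting per-residue representations for a TCR sequence; the repository id and the space-separated, one-residue-per-token input format are assumptions inferred from the listing and typical BERT-style protein tokenizers, so verify them against the project README:

```python
# Sketch: per-residue embeddings for a TCR CDR3 sequence with TCR-BERT.
# The repo id and the space-separated residue format are assumptions.
import torch
from transformers import AutoTokenizer, AutoModel

repo = "wukevin/tcr-bert-mlm-only"  # assumed id; verify on the author's page
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModel.from_pretrained(repo)

cdr3 = "C A S S P V T G G I Y G Y T F"  # one amino acid per token
inputs = tokenizer(cdr3, return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state
print(hidden.shape)  # (1, tokens incl. [CLS]/[SEP], hidden_size)
```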