
ESM-1b (esm1b_t33_650M_UR50S)

Developed by Facebook AI Research (FAIR)
ESM-1b is a Transformer-based protein language model trained via unsupervised learning on protein sequence data, capable of predicting protein structure and function.
Downloads: 24.20k
Release date: 10/17/2022

Model Overview

ESM-1b is a Transformer-based protein language model trained with an unsupervised objective on protein sequences, learning general-purpose protein representations that transfer to downstream prediction tasks.
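The unsupervised objective ESM-1b is trained with is masked language modeling: some residues are hidden and the model is scored on how well it predicts them. The toy sketch below illustrates that objective with a uniform stand-in predictor; the amino-acid alphabet and probability array are placeholders, not the real model's output.

```python
import numpy as np

# Masked-language-modeling loss sketch: mask residues, predict them,
# score with cross-entropy at the masked positions only.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
AA_INDEX = {aa: i for i, aa in enumerate(AMINO_ACIDS)}

def mlm_loss(sequence, predicted_probs, masked_positions):
    """Mean cross-entropy over the masked positions.

    predicted_probs: (len(sequence), 20) array of per-position
    amino-acid probabilities from some model (a stand-in here).
    """
    losses = []
    for pos in masked_positions:
        true_idx = AA_INDEX[sequence[pos]]
        losses.append(-np.log(predicted_probs[pos, true_idx]))
    return float(np.mean(losses))

seq = "MKTAYIAKQR"
# A uniform predictor assigns 1/20 to every amino acid everywhere.
uniform = np.full((len(seq), 20), 1 / 20)
loss = mlm_loss(seq, uniform, masked_positions=[2, 5, 8])
# A uniform predictor yields loss = ln(20) at every masked position.
```

A trained model concentrates probability on the residues that actually occur, driving this loss well below the uniform baseline of ln(20).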

Model Features

Unsupervised Pretraining
Trained solely on raw protein sequences without requiring structural or functional labels.
Universal Feature Learning
Learns universal protein features through a masked language modeling objective, transferable to multiple downstream tasks.
Structural Inference Capability
The model's attention maps correlate with residue-residue contacts in 3D protein structures, so contacts can be read out of the attention heads.
Functional Impact Prediction
Can score the impact of sequence variations on protein function.
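A common way to score a point mutation with a protein language model is a log-odds ratio: the log-probability of the mutant residue minus that of the wild type at the mutated position. The sketch below assumes a per-position probability matrix standing in for real model output; the array values are synthetic.

```python
import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
AA_INDEX = {aa: i for i, aa in enumerate(AMINO_ACIDS)}

def variant_score(probs, position, wt, mut):
    """Log-odds score of a point mutation at `position`.

    probs: (L, 20) per-position amino-acid probabilities, e.g. taken
    from a model's output with that position masked. Negative scores
    suggest the mutation is less favourable than wild type.
    """
    return float(np.log(probs[position, AA_INDEX[mut]])
                 - np.log(probs[position, AA_INDEX[wt]]))

# Stand-in probabilities that strongly favour the wild-type residue A.
L = 5
probs = np.full((L, 20), 0.01)
probs[:, AA_INDEX["A"]] = 0.81  # each row sums to 1 (0.81 + 19 * 0.01)
score = variant_score(probs, position=2, wt="A", mut="W")
# log(0.01) - log(0.81) < 0: the mutation is predicted deleterious.
```

With the real model, `probs` would come from a forward pass over the masked sequence rather than a hand-built array.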

Model Capabilities

Protein sequence feature extraction
Protein structure prediction
Protein function prediction
Sequence variant impact scoring
Remote homology detection
Secondary structure prediction
Contact prediction
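Contact prediction from a Transformer typically symmetrizes the attention maps and applies the average-product correction (APC) to suppress background coupling. The sketch below uses random arrays in place of real attention, and averages heads with a plain mean where real pipelines learn per-head weights with a logistic regression.

```python
import numpy as np

def apc(contacts):
    """Average-product correction: subtract the outer product of
    row/column sums divided by the total, removing background signal."""
    row = contacts.sum(axis=0, keepdims=True)
    col = contacts.sum(axis=1, keepdims=True)
    total = contacts.sum()
    return contacts - (col * row) / total

def contacts_from_attention(attn):
    """attn: (heads, L, L) attention maps. Symmetrize each head,
    average over heads, then apply APC. The unweighted mean over
    heads is a simplification of the learned combination."""
    sym = (attn + attn.transpose(0, 2, 1)) / 2
    return apc(sym.mean(axis=0))

rng = np.random.default_rng(0)
attn = rng.random((4, 10, 10))   # 4 hypothetical heads, length-10 sequence
cmap = contacts_from_attention(attn)
assert cmap.shape == (10, 10)
assert np.allclose(cmap, cmap.T)  # symmetric by construction
```

High values of `cmap[i, j]` (relative to the rest of the map) would be read as candidate contacts between residues i and j.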

Use Cases

Biomedical Research
Protein Activity Prediction
Fit regression models on ESM-1b output features to predict the activity of new sequences.
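A minimal version of this workflow: mean-pool the per-residue representations into one fixed-length vector per sequence, then fit a ridge regression against measured activity. The embeddings and labels below are synthetic stand-ins (real ESM-1b embeddings have dimension 1280).

```python
import numpy as np

# Synthetic stand-ins for mean-pooled sequence embeddings and
# measured activity values.
rng = np.random.default_rng(1)
n_seq, dim = 50, 16
X = rng.normal(size=(n_seq, dim))
true_w = rng.normal(size=dim)
y = X @ true_w + 0.01 * rng.normal(size=n_seq)

def ridge_fit(X, y, alpha=1.0):
    """Closed-form ridge regression: w = (X^T X + alpha I)^-1 X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)

w = ridge_fit(X, y)
pred = X @ w
# Correlation between predictions and the (noisy) labels.
r = np.corrcoef(pred, y)[0, 1]
```

On real data one would hold out sequences for evaluation; the closed-form solve here is equivalent to scikit-learn's `Ridge` for this small setting.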
Mutation Functional Impact Analysis
Evaluate the functional impact of protein sequence variations.
The model achieved state-of-the-art results on related benchmarks.
Protein Engineering
Protein Design
Guide protein sequence design using model predictions.