# Large-scale Transformer Models

## Nucleotide Transformer 500m Human Ref

InstaDeepAI · Molecular Model, Transformers · 4,482 downloads · 12 likes

A 500M-parameter Transformer model pre-trained on the human reference genome. It is part of the Nucleotide Transformer collection, which integrates DNA sequence information from over 3,200 diverse human genomes and 850 species.

## Pythia 6.9b

EleutherAI · Apache-2.0 · Large Language Model, Transformers, English · 46.72k downloads · 54 likes

Pythia-6.9B is a large-scale language model developed by EleutherAI as part of the Pythia suite, which was designed specifically to facilitate interpretability research.

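As a reference point, a checkpoint like this loads through Hugging Face's transformers library the same way as any other causal language model. The snippet below is a minimal sketch, assuming the Hub ID EleutherAI/pythia-6.9b and enough memory to hold a 6.9B-parameter model.

```python
# Minimal sketch: load Pythia-6.9B as a causal LM and generate a short
# continuation. Assumes transformers and torch are installed and that the
# checkpoint EleutherAI/pythia-6.9b is available on the Hugging Face Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EleutherAI/pythia-6.9b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

inputs = tokenizer("Interpretability research studies", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
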
## M-CTC-T Large

speechbrain · Apache-2.0 · Speech Recognition, Transformers, English · 88 downloads · 20 likes

A large-scale multilingual speech recognition model introduced by Meta AI, supporting 60 languages and based on a 1-billion-parameter Transformer encoder architecture.

## Bloom 1b7

bigscience · OpenRAIL · Large Language Model, Multilingual · 105.70k downloads · 121 likes

The BigScience Large Open-science Open-access Multilingual Language Model (BLOOM), supporting 46 natural languages and 13 programming languages.

## MCTCT Large

cwkeam · Apache-2.0 · Speech Recognition, Transformers, English · 21 downloads · 0 likes

A large-scale multilingual speech recognition model introduced by Meta AI, featuring 1 billion parameters and supporting character-level transcription for 60 languages.

## Opus Mt Tc Big Hu En

Helsinki-NLP · Machine Translation, Transformers, Multilingual · 371 downloads · 3 likes

A neural machine translation model for translating from Hungarian to English, part of the OPUS-MT project.

## Opus Mt Tc Big Bg En

Helsinki-NLP · Machine Translation, Transformers, Multilingual · 69 downloads · 3 likes

A neural machine translation model for translating from Bulgarian to English, developed as part of the OPUS-MT project.

## Opus Mt Tc Big En Lt

Helsinki-NLP · Machine Translation, Transformers, Multilingual · 204 downloads · 2 likes

A neural machine translation model for English-to-Lithuanian translation, part of the OPUS-MT project.

## Opus Mt Tc Big En Zle

Helsinki-NLP · Machine Translation, Transformers, Multilingual · 565 downloads · 0 likes

A neural machine translation model from English to East Slavic languages (Belarusian, Russian, and Ukrainian), part of the OPUS-MT project.

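Because this checkpoint has several target languages, OPUS-MT multi-target models select the output language with a sentence-initial token such as >>rus<< or >>ukr<<. The snippet below is a minimal sketch of that usage, assuming the Hub ID Helsinki-NLP/opus-mt-tc-big-en-zle and the sentencepiece dependency.

```python
# Minimal sketch: translate English into two East Slavic targets by
# prefixing each input with a target-language token (>>rus<<, >>ukr<<).
# Assumes transformers and sentencepiece are installed and the checkpoint
# Helsinki-NLP/opus-mt-tc-big-en-zle is available on the Hugging Face Hub.
from transformers import pipeline

translator = pipeline("translation", model="Helsinki-NLP/opus-mt-tc-big-en-zle")

for text in [">>rus<< The meeting starts at noon.",
             ">>ukr<< The meeting starts at noon."]:
    print(translator(text)[0]["translation_text"])
```
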
## Opus Mt Tc Big En Fi

Helsinki-NLP · Machine Translation, Transformers, Multilingual · 1,255 downloads · 2 likes

A large-scale neural machine translation model based on the Transformer architecture, designed for translating English to Finnish. The model is part of the OPUS-MT project, was trained with the Marian NMT framework, and is served through Hugging Face's transformers library.

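Since the entry notes that the checkpoint was trained with Marian NMT and is served through transformers, the sketch below shows one way to load it with the Marian classes, assuming the Hub ID Helsinki-NLP/opus-mt-tc-big-en-fi and the sentencepiece dependency.

```python
# Minimal sketch: English-to-Finnish translation with the Marian classes
# in transformers. Assumes the checkpoint Helsinki-NLP/opus-mt-tc-big-en-fi
# is available on the Hugging Face Hub and sentencepiece is installed.
from transformers import MarianMTModel, MarianTokenizer

model_id = "Helsinki-NLP/opus-mt-tc-big-en-fi"
tokenizer = MarianTokenizer.from_pretrained(model_id)
model = MarianMTModel.from_pretrained(model_id)

batch = tokenizer(["A cup of coffee, please."], return_tensors="pt", padding=True)
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```
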
## Xlm Mlm 17 1280

FacebookAI · Large Language Model, Transformers, Multilingual · 201 downloads · 2 likes

A cross-lingual language model pre-trained on text in 17 languages using the masked language modeling (MLM) objective.

## Xlm Mlm 100 1280

FacebookAI · Large Language Model, Transformers, Multilingual · 296 downloads · 4 likes

A cross-lingual language model pre-trained on Wikipedia text in 100 languages using the masked language modeling (MLM) objective.