# Large-scale Parameters
## Perception LM 8B

- **Author:** facebook
- **License:** Other
- **Task:** Large Language Model
- **Tags:** PyTorch, English

A pretrained language model released by Meta on the PyTorch framework, intended for non-commercial research use.
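Since the model ships as standard PyTorch weights, a minimal loading sketch with Hugging Face `transformers` might look like the following; the repo id `facebook/Perception-LM-8B` and the need for `trust_remote_code` are assumptions, not confirmed by the listing.

```python
from transformers import AutoConfig, AutoModel

model_id = "facebook/Perception-LM-8B"  # assumed repo id; check the hub listing

# Custom architectures often require trust_remote_code=True (an assumption here).
config = AutoConfig.from_pretrained(model_id, trust_remote_code=True)
model = AutoModel.from_pretrained(model_id, trust_remote_code=True, torch_dtype="auto")

print(f"{model.num_parameters() / 1e9:.1f}B parameters")
```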
## NLLB-200 BnB 4-bit

- **Author:** Youseff1987
- **Task:** Machine Translation
- **Tags:** Transformers

A 4-bit bitsandbytes quantization of NLLB-200-3.3B, the multilingual neural machine translation model developed by Meta (formerly Facebook) that supports translation between 200 languages.
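As a sketch of typical NLLB usage with `transformers` (the repo id below is an assumption, and a pre-quantized bitsandbytes checkpoint additionally requires the `bitsandbytes` and `accelerate` packages); NLLB addresses languages by FLORES-200 codes such as `eng_Latn` and `fra_Latn`:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "Youseff1987/nllb-200-bnb-4bit"  # assumed repo id

# src_lang sets the source-language tag prepended to the input.
tokenizer = AutoTokenizer.from_pretrained(model_id, src_lang="eng_Latn")
model = AutoModelForSeq2SeqLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("Hello, world!", return_tensors="pt").to(model.device)
out = model.generate(
    **inputs,
    # Force the target language by starting generation with its language token.
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("fra_Latn"),
    max_new_tokens=50,
)
print(tokenizer.batch_decode(out, skip_special_tokens=True)[0])
```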
## GNER-T5-xxl

- **Author:** dyyyyyyyy
- **License:** Apache-2.0
- **Task:** Sequence Labeling
- **Tags:** Transformers, English

GNER-T5-xxl is a generative named entity recognition model built on the Flan-T5 architecture, with 11 billion parameters; it shows strong performance on zero-shot recognition tasks.
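Generative NER frames tagging as sequence-to-sequence generation: the sentence plus an instruction goes in, a labeled sequence comes out. A minimal sketch with `transformers`, assuming the repo id `dyyyyyyyy/GNER-T5-xxl`; the exact prompt template is defined on the model card, so the one below is only illustrative:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "dyyyyyyyy/GNER-T5-xxl"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id, device_map="auto")

# Illustrative instruction-style prompt; consult the model card for the real format.
prompt = (
    "Please analyze the sentence and label each word with its entity type "
    "(person, location, organization, or O).\n"
    "Sentence: Barack Obama visited Paris last week."
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```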
## Ziya-LLaMA-13B-Pretrain-v1

- **Author:** IDEA-CCNL
- **License:** GPL-3.0
- **Task:** Large Language Model
- **Tags:** Transformers, multilingual

A large-scale pretrained model with 13 billion parameters based on the LLaMA architecture. Its tokenizer is optimized for Chinese, and it underwent 110 billion tokens of continued pretraining on Chinese and English text, substantially improving Chinese generation and comprehension.
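A minimal generation sketch with `transformers`, assuming fully merged weights are available under the repo id `IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1` (note that LLaMA-derived models of this era were often distributed as delta weights that must first be merged with the original LLaMA checkpoint):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "IDEA-CCNL/Ziya-LLaMA-13B-Pretrain-v1"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# Chinese prompt ("The capital of China is"); the model targets zh/en text.
inputs = tokenizer("中国的首都是", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=32, do_sample=False)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```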
## T5-Efficient-LARGE

- **Author:** google
- **License:** Apache-2.0
- **Task:** Large Language Model
- **Tags:** English

T5-Efficient-LARGE is a variant of Google's T5 from the Deep-Narrow model family, which favors depth over width to improve downstream task performance; this checkpoint has 737.7 million parameters.
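The parameter count is easy to verify once the checkpoint is loaded. Note that `google/t5-efficient-large` is a pretrained-only English checkpoint intended to be fine-tuned before use; a quick check:

```python
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-large")
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.1f}M parameters")  # expected: ~737.7M per the model card
```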
## T5-Efficient-LARGE-NH32

- **Author:** google
- **License:** Apache-2.0
- **Task:** Large Language Model
- **Tags:** English

T5-Efficient-LARGE-NH32 is a variant of Google's T5-Large from the Deep-Narrow model family; the NH32 configuration raises the number of attention heads to 32 rather than increasing depth.
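The NH32 naming can be checked directly from the model configs: comparing the base large variant against the NH32 variant should show the head count rising while depth stays the same (repo ids assumed to follow the `google/t5-efficient-large-nh32` pattern):

```python
from transformers import AutoConfig

base = AutoConfig.from_pretrained("google/t5-efficient-large")
nh32 = AutoConfig.from_pretrained("google/t5-efficient-large-nh32")

# T5 configs expose num_heads and num_layers directly.
print("heads: ", base.num_heads, "->", nh32.num_heads)    # expected: 16 -> 32
print("layers:", base.num_layers, "->", nh32.num_layers)  # expected: unchanged
```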